
I want to send money between different PaypalCustomerAccount records, and every user's transaction should be recorded in the UserTransaction table. For example, take the PaypalCustomerAccount with user_id rmMe7a68kOXIvblu4aah1ZHc7Qx2: when that user makes a transaction, it is saved into the UserTransaction table under her user_id with a new transaction_id, -LOvXSpmHnjmN2sWkhap.

I want to use a DialogFlow chatbot, integrated into my Android Studio app, to send money between PaypalCustomerAccount records.

The screenshots show what the Android chatbot looks like. When the user wants to send a "whoosh" (a digital cheque), the conversation goes:

  • User: send a whoosh
  • Bot: Sure, who would you like to send it to?


  • User: send it to susie
  • Bot: May I know the post date for the cheque?


  • User: send it tmr
  • Bot: Alright, I've sent a whoosh to susie. It will be processed by 2018-10-17.


Now, after the user makes the request, I want to save the new transaction to my Firebase database: from PaypalCustomerAccount rmMe7a68kOXIvblu4aah1ZHc7Qx2 to a2u4aqw3Hc7Qx2. That means two new transaction records should appear under UserTransaction, as sketched after this list:

  1. one for a2u4aqw3Hc7Qx2, with the status receive, and
  2. one for rmMe7a68kOXIvblu4aah1ZHc7Qx2, with the status sent.
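
Here is a minimal sketch of the two records I want to end up with, written as a JavaScript object. The field names otherParty and processDate are placeholders, since the real ones come from my screenshots; only the user_ids, the example transaction_id, and the sent/receive statuses are taken from the description above.

// Hypothetical shape of the UserTransaction node after the transfer.
// Keys starting with -L are push IDs that Firebase generates.
const userTransaction = {
  'rmMe7a68kOXIvblu4aah1ZHc7Qx2': {
    '-LOvXSpmHnjmN2sWkhap': {                     // push ID from my example above
      status: 'sent',
      otherParty: 'a2u4aqw3Hc7Qx2',               // placeholder field name
      processDate: '2018-10-17'                   // placeholder field name
    }
  },
  'a2u4aqw3Hc7Qx2': {
    '-LOvXT0000000000001': {                      // hypothetical push ID
      status: 'receive',
      otherParty: 'rmMe7a68kOXIvblu4aah1ZHc7Qx2',
      processDate: '2018-10-17'
    }
  }
};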

The following is the index.js code in the fulfillment page for Dialogflow. I do not know how to write the code that makes the above happen (i.e. a new transaction record appearing under UserTransaction). Do I write that code in the Dialogflow fulfillment or in my Android Studio project (the backend code for the chatbot)?

'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

// initialise DB connection
const admin = require('firebase-admin');
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://whooshapplication.firebaseio.com/', // must be an https:// URL, not ws://
});


process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function SendWhoosh(agent) {
    const nameParam = agent.parameters.name;
    const context = agent.getContext('send_a_whoosh');
    const name = nameParam || context.parameters.name;

    // push input into db
  }

  // welcome handler referenced in the intent map below
  function welcome(agent) {
    agent.add(`Welcome to my agent!`);
  }

  function fallback(agent) {
    agent.add(`I didn't understand`);
    agent.add(`I'm sorry, can you try again?`);
  }

  // // Uncomment and edit to make your own intent handler
  // // uncomment `intentMap.set('your intent name here', yourFunctionHandler);`
  // // below to get this function to be run when a Dialogflow intent is matched
  // function yourFunctionHandler(agent) {
  //   agent.add(`This message is from Dialogflow's Cloud Functions for Firebase editor!`);
  //   agent.add(new Card({
  //       title: `Title: this is a card title`,
  //       imageUrl: 'https://developers.google.com/actions/images/badges/XPM_BADGING_GoogleAssistant_VER.png',
  //       text: `This is the body text of a card.  You can even use line\n  breaks and emoji! `,
  //       buttonText: 'This is a button',
  //       buttonUrl: 'https://assistant.google.com/'
  //     })
  //   );
  //   agent.add(new Suggestion(`Quick Reply`));
  //   agent.add(new Suggestion(`Suggestion`));
  //   agent.setContext({ name: 'weather', lifespan: 2, parameters: { city: 'Rome' }});
  // }

  // // Uncomment and edit to make your own Google Assistant intent handler
  // // uncomment `intentMap.set('your intent name here', googleAssistantHandler);`
  // // below to get this function to be run when a Dialogflow intent is matched
  // function googleAssistantHandler(agent) {
  //   let conv = agent.conv(); // Get Actions on Google library conv instance
  //   conv.ask('Hello from the Actions on Google client library!') // Use Actions on Google library
  //   agent.add(conv); // Add Actions on Google library responses to your agent's response
  // }
  // // See https://github.com/dialogflow/dialogflow-fulfillment-nodejs/tree/master/samples/actions-on-google
  // // for a complete Dialogflow fulfillment library Actions on Google client library v2 integration sample

  // Run the proper function handler based on the matched Dialogflow intent name
  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  // intentMap.set('your intent name here', yourFunctionHandler);
  // intentMap.set('your intent name here', googleAssistantHandler);
  agent.handleRequest(intentMap);
});


This is my ChatBot.java code from Android Studio that makes the chatbot work:

public class ChatBot extends AppCompatActivity implements AIListener
{
    public Bot bot;
    public static Chat chat;

    private ListView mListView;
    private Button mButtonSend;
    private EditText mEditTextMessage;
    private Button mImageView;
    private ChatMessageAdapter mAdapter;

    AIService aiService;

    private static final String TAG = "ChatBot";

    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_chat_bot);

        mListView = (ListView) findViewById(R.id.listView);
        mImageView = (Button)findViewById(R.id.voice_record);
        mButtonSend = (Button) findViewById(R.id.btn_send);
        mEditTextMessage = (EditText) findViewById(R.id.et_message);
        mAdapter = new ChatMessageAdapter(this, new ArrayList<ChatMessage>());
        mListView.setAdapter(mAdapter);

        int permission = ContextCompat.checkSelfPermission(this,
                Manifest.permission.RECORD_AUDIO);

        if (permission != PackageManager.PERMISSION_GRANTED)
        {
            Log.i(TAG, "Permission to record denied");
            makeRequest();
        }

        final AIConfiguration config = new AIConfiguration("c43d5450b1a54959a44158fb897f1dcb",
                AIConfiguration.SupportedLanguages.English,
                AIConfiguration.RecognitionEngine.System);

        aiService = AIService.getService(this, config);
        aiService.setListener(this);



        mButtonSend.setOnClickListener(new View.OnClickListener()
        {
            @Override
            public void onClick(View v)
            {
                String message = mEditTextMessage.getText().toString();
                //bot
                String response = chat.multisentenceRespond(mEditTextMessage.getText().toString());
                if (TextUtils.isEmpty(message))
                {
                    return;
                }
                sendMessage(message);
                mimicOtherMessage(response);
                mEditTextMessage.setText("");
                mListView.setSelection(mAdapter.getCount() - 1);
            }
        });

        mImageView.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                aiService.startListening();
            }
        });

        // *************************************

        //checking SD card availablility
        boolean a = isSDCARDAvailable();
        //receiving the assets from the app directory
        AssetManager assets = getResources().getAssets();
        File jayDir = new File(Environment.getExternalStorageDirectory().toString() + "/cheque/bots/whoosh");
        boolean b = jayDir.mkdirs();
        if (jayDir.exists())
        {
            //Reading the file
            try {
                for (String dir : assets.list("whoosh")) {
                    File subdir = new File(jayDir.getPath() + "/" + dir);
                    boolean subdir_check = subdir.mkdirs();
                    for (String file : assets.list("whoosh/" + dir)) {
                        File f = new File(jayDir.getPath() + "/" + dir + "/" + file);
                        if (f.exists())
                        {
                            continue;
                        }
                        InputStream in = null;
                        OutputStream out = null;
                        in = assets.open("whoosh/" + dir + "/" + file);
                        out = new FileOutputStream(jayDir.getPath() + "/" + dir + "/" + file);
                        //copy file from assets to the mobile's SD card or any secondary memory
                        copyFile(in, out);
                        in.close();
                        in = null;
                        out.flush();
                        out.close();
                        out = null;
                    }
                }
            } catch (IOException e)
            {
                e.printStackTrace();
            }
        }
        //get the working directory
        MagicStrings.root_path = Environment.getExternalStorageDirectory().toString() + "/cheque";
        System.out.println("Working Directory = " + MagicStrings.root_path);
        AIMLProcessor.extension =  new PCAIMLProcessorExtension();
        //Assign the AIML files to bot for processing
        bot = new Bot("whoosh", MagicStrings.root_path, "chat");
        chat = new Chat(bot);
        String[] args = null;
        mainFunction(args);

        // *************************************

    } // onCreate

    protected void makeRequest()
    {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.RECORD_AUDIO},
                101);
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults)
    {
        switch (requestCode)
        {
            case 101:
            {
                if (grantResults.length == 0 || grantResults[0] != PackageManager.PERMISSION_GRANTED)
                {
                    Log.i(TAG, "Permission has been denied by user");
                }
                else
                {
                    Log.i(TAG, "Permission has been granted by user");
                }
                return;
            }
        }
    }

    public void voiceclick(View view)
    {
        aiService.startListening();
    }

    //check SD card availability
    public static boolean isSDCARDAvailable()
    {
        return Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED);
    }

    //copying the file
    private void copyFile(InputStream in, OutputStream out) throws IOException
    {
        byte[] buffer = new byte[1024];
        int read;
        while((read = in.read(buffer)) != -1)
        {
            out.write(buffer, 0, read);
        }
    } //copyFile

    //Request and response of user and the bot
    public static void mainFunction (String[] args)
    {
        MagicBooleans.trace_mode = false;
        System.out.println("trace mode = " + MagicBooleans.trace_mode);
        Graphmaster.enableShortCuts = true;
        Timer timer = new Timer();
        String request = "Hello.";
        String response = chat.multisentenceRespond(request);

        System.out.println("Human: "+request);
        System.out.println("Robot: " + response);
    } //mainFunction

    private void sendMessage(String message)
    {
        ChatMessage chatMessage = new ChatMessage(message, true, false);
        mAdapter.add(chatMessage);
        //respond as Helloworld
        mimicOtherMessage("HelloWorld");
    } //sendMessage

    private void mimicOtherMessage(String message)
    {
        ChatMessage chatMessage = new ChatMessage(message, false, false);
        mAdapter.add(chatMessage);
    }

    private void sendMessage()
    {
        ChatMessage chatMessage = new ChatMessage(null, true, true);
        mAdapter.add(chatMessage);

        mimicOtherMessage();
    }

    private void mimicOtherMessage()
    {
        ChatMessage chatMessage = new ChatMessage(null, false, true);
        mAdapter.add(chatMessage);
    }

    @Override
    public void onResult(AIResponse result)
    {
        Log.d("anu",result.toString());
        Result result1 = result.getResult();
        mEditTextMessage.setText("Query" + result1.getResolvedQuery() + " action: " + result1.getAction());

    }

    @Override
    public void onError(AIError error) {

    }

    @Override
    public void onAudioLevel(float level) {

    }

    @Override
    public void onListeningStarted() {

    }

    @Override
    public void onListeningCanceled() {

    }

    @Override
    public void onListeningFinished() {

    }
}


Here is also a screenshot of my intent in Dialogflow.

Please help me, and thank you in advance!


1 Answer


First of all, this is an awesome project you are working on: real-time chatbot transaction management. I have reviewed each file you attached and your question. Here are the important steps you can follow to implement what you asked, with a short sketch after the list:

  1. Check out and integrate the PayPal SDK in your Android application: https://github.com/paypal/PayPal-Android-SDK
  2. Then, using its methods and objects, you can take the values that PayPal returns for any transaction into your Android application, and from there pass them on to the Realtime Database in Firebase.
  3. The chatbot code is fine here; you just have to trigger an action when the particular intent is matched (i.e. when the user says to send money to someone) and hand it off to the Android application backend, where the PayPal code will run in an Android class file.
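
For the Firebase part specifically, the admin SDK is already initialized in your index.js, so the two UserTransaction records can be pushed right where your // push input into db comment sits in SendWhoosh. Below is a minimal sketch, not a definitive implementation: the parameter names name and date, the field names otherParty and processDate, and the hard-coded user IDs are all assumptions that you would replace with your real intent parameters and a lookup against PaypalCustomerAccount.

function SendWhoosh(agent) {
  // Parameter names are assumptions; they must match your Dialogflow intent.
  const name = agent.parameters.name;   // e.g. "susie"
  const date = agent.parameters.date;   // post date for the cheque

  // Hard-coded IDs for illustration only; in a real app you would look the
  // receiver up in PaypalCustomerAccount by name and take the sender from auth.
  const senderId = 'rmMe7a68kOXIvblu4aah1ZHc7Qx2';
  const receiverId = 'a2u4aqw3Hc7Qx2';

  const db = admin.database();

  // One record under the sender with status "sent" ...
  const sent = db.ref('UserTransaction/' + senderId).push({
    status: 'sent',
    otherParty: receiverId,   // assumed field name
    processDate: date         // assumed field name
  });

  // ... and one under the receiver with status "receive".
  const received = db.ref('UserTransaction/' + receiverId).push({
    status: 'receive',
    otherParty: senderId,
    processDate: date
  });

  // Return the promise so the Cloud Function does not terminate
  // before both writes complete.
  return Promise.all([sent, received]).then(function () {
    agent.add('Alright, I\'ve sent a whoosh to ' + name +
              '. It will be processed by ' + date + '.');
  });
}

Remember to register the handler in the intent map, using the exact intent name from your Dialogflow console (the name below is a guess):

intentMap.set('send a whoosh', SendWhoosh);

This does not conflict with the steps above: the fulfillment records the transaction request in the Realtime Database, while the actual money movement can still be handled by the PayPal SDK on the Android side.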

I hope that helps! If you want to know more, please feel free to ask any doubts in the reply section.

All the best for your future implementation!
