Abstract | We show that by changing the grammar of the formal meaning representation language and training on additional data collected from Amazon’s Mechanical Turk we can further improve the results.
Collecting Additional Data with Mechanical Turk | We validate this claim by collecting additional training data for the navigation domain using Mechanical Turk (Snow et al., 2008). |
Collecting Additional Data with Mechanical Turk | Thus, we created two tasks on Mechanical Turk.
Collecting Additional Data with Mechanical Turk | The instructions used for the follower problems came mainly from our Mechanical Turk instructor task, with some drawn from data collected by MacMahon (2007) that was not used by Chen and Mooney (2011).
Conclusion | In addition, we showed that changing the MRG and collecting additional training data from Mechanical Turk further improve the performance of the overall navigation system. |
Experiments | In addition to SGOLL and SGOLL with the new MRG, we also look at augmenting each of the training splits with the data we collected using Mechanical Turk.
Experiments | Augmenting the training data with additional instructions and follower traces collected from Mechanical Turk produced the best results.
Experiments | Even after incorporating the new Mechanical Turk data into the training set, SGOLL still takes much less time to build a lexicon. |
Introduction | To show that the algorithm can scale to larger datasets, we present results on collecting and training on additional data from Amazon’s Mechanical Turk.