Amazon MTurk

You may want to recruit participants via the crowdsourcing marketplace Amazon Mechanical Turk (MTurk). The following explains how to do this.

Note: We have to assume that worker IDs are personal data. Restrictions under the GDPR (see personal data) may therefore apply. Please speak to your data protection officer in advance if you intend to use individual codes or worker IDs.

Show a completion code

According to the official information from Amazon, you should display a code at the end of the questionnaire, which the workers then enter into the MTurk form.

One possibility is to display a fixed code if the respondents meet your quality check criteria.

if (caseTime('begin') > 600) {
  html('<p>The questionnaire code is 12345678</p>');
} else {
  html('<p>We are sorry. You have not filled out this survey carefully enough.</p>');
}

However, you will likely want to provide individual codes to prevent workers from sharing the code. To do this, please follow the instructions for displaying individual codes or voucher codes.

However, using the official code solution means some additional work in determining which workers have earned the HIT and which have not.

Copy the worker ID

If you create a new MTurk task using the “Survey Link” template, you can modify the code to send the worker ID directly to your data set.

MTurk task

In the Amazon MTurk requester administration, select Create → New Project and start with the Survey Link template. Click Create Project to proceed to the details.

In the first tab of the Amazon MTurk form, enter the general information (title, description, ...).

In the second tab, "Design Layout", click the "Source" button to see the HTML source code of the design.

There are two changes you need to make in the source code. First, insert a line at the very beginning, directly below the opening tag, to embed the JavaScript.




Next, add the following code, which writes the survey URL with the appropriate parameters. On the next line, find the tag and replace its content as follows:




The first parameter of the function is the URL of your survey.

The second parameter is required to display the link.

You can also decide for or against a survey code. If you do not want one, simply remove the corresponding line.
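As an illustration of the two changes described above, the following is a minimal sketch of what such a script can do. This is not Amazon's official template code: the survey URL and the element ID `survey-link` are placeholders, and `workerId`, `assignmentId`, and `hitId` are the standard parameters that MTurk appends to external URLs.

```javascript
// Illustrative sketch only - adapt URL, parameter names, and element ID
// to your own MTurk template and SoSci Survey project.

// Extract a single GET parameter from a query string (empty string if absent).
function getParam(name, queryString) {
  return new URLSearchParams(queryString).get(name) || '';
}

// Build the survey URL, passing worker, assignment, and HIT ID on to the survey.
function buildSurveyLink(baseUrl, queryString) {
  const params = new URLSearchParams({
    workerId: getParam('workerId', queryString),
    assignmentId: getParam('assignmentId', queryString),
    hitId: getParam('hitId', queryString),
  });
  return baseUrl + '?' + params.toString();
}
```

In the HIT page itself, such a function would be called with the page's own query string, e.g. `document.getElementById('survey-link').href = buildSurveyLink('https://www.soscisurvey.de/yourProject/', window.location.search);` (the project URL here is a placeholder).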

After you have made these changes, click the "Source" button again and edit the remaining texts, e.g., the study description.

Supplement in SoSci Survey

By default, in the code above, the worker ID is saved as a reference (variable REF). You can also save the assignment ID and the HIT ID.

In SoSci Survey, create a question of the type client. Open the tab Variables (POST/GET) for this question and enter the following GET keys to be saved:
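Which keys to enter depends on the parameter names used in your MTurk template code. Assuming the template passes on MTurk's standard URL parameters, the GET keys would be:

```
workerId
assignmentId
hitId
```

If your template code uses different parameter names, enter those instead.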

Save the question and place it on the first page of your questionnaire.

Release HITs

Most likely, you will clean your data set, e.g., on the basis of instructed response items (very good), sham items (also good), and/or response times (fair; consider using TIME_RSI).

You then have the worker IDs of the remaining records that meet your quality criteria. Of course, you could manually check all assignments in MTurk, but that takes a long time. It is better to work with a CSV file.

You need the HIT ID and the assignment ID for the CSV file. If you have followed the instructions above, this data is available in your data set. Create a table (with R, SPSS, LibreOffice Calc, Excel, ...) with three columns from your data: the first column contains the HIT ID, the second column the assignment ID, and the third column must contain an "x" for each entry.

The first row of the table is the header with the variable names. Make sure that the variable names are as follows (case-sensitive):
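For example, an approval file could look like the sketch below. Note that the header names shown here are an assumption based on MTurk's exported results file; verify them against a results CSV downloaded from your own HIT before uploading:

```
HITId,AssignmentId,Approve
3ABCDEFGHIJ,2KLMNOPQRST,x
3ABCDEFGHIJ,2UVWXYZABCD,x
```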

Then save the table as a CSV file and, in MTurk under Manage → Review Results, click the button Upload CSV. Select the newly saved CSV file and confirm. MTurk will then ask whether you want to approve these assignments.

When you are done, only the assignments that did not meet your quality criteria should remain. Reject them and explain why.

Alternatively, do not delete the data records immediately during data cleaning, but only mark them. You can then upload a CSV file with four columns, the fourth named “Rejection” and containing an explanation for the rejection. This makes sure you do not reject someone whose ID has been lost.
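For this variant, the file gains a fourth column; rows to approve carry the "x", rows to reject carry the reason instead. Again, the exact header names are an assumption and should be checked against your own MTurk results export:

```
HITId,AssignmentId,Approve,Rejection
3ABCDEFGHIJ,2KLMNOPQRST,x,
3ABCDEFGHIJ,2EFGHIJKLMN,,Completed in under 60 seconds
```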