External Labellers

Video Recap

  1. Select Hire Labellers - The Hire Labellers button can be found at the bottom left of the Annotator once you have selected an image to annotate.
  2. Design a Brief - Give a brief description of your project, the types of objects you seek to annotate, and the annotation type you require, along with an approximate budget and at least 8 annotated images.
  3. Cost Estimation - Choose an annotation pace (‘Basic’, ‘Standard’, or ‘Priority’) and use the slider to select how many images you want annotated; the estimated cost appears at the bottom.
  4. Review Details - Check your project description, image count, pace, and estimated total cost, then agree to the terms and conditions to send in the request.
  5. Track the Job - Your submitted job appears under Scheduled Jobs.

Starting a Job

Design a Brief

The first step, ‘Design a Brief’, asks you to give a brief description of your project, detailing what types of objects you seek to annotate and what annotation type you require. You also need to provide an approximate budget that you would be willing to spend on annotation, as well as at least 8 annotated images on the project. The purpose of the 8 annotated images is to provide a quality sample and to help the hired labellers identify objects that are not commonly known or are use-case specific. Ideally, the annotated images should be as representative of your dataset as possible: they should show example annotations of all the different types of objects and aptly demonstrate the required detail and precision.

Design a Brief

Cost Estimation

The second step defines the specific requirements of the annotation task. Here, you can choose between three options for annotation pace: ‘Basic’, ‘Standard’, and ‘Priority’. The options differ in how many annotators are assigned to the project. The slider below lets you select how many images you want annotated. At the very bottom is the estimated cost, calculated from the options you have selected.
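As a rough mental model, the estimate at the bottom of this step can be thought of as the number of images multiplied by a per-image rate that rises with the chosen pace. The sketch below is purely illustrative: the `PACE_RATES` values are made-up placeholders, not Nexus's actual pricing.

```python
# Hypothetical sketch of the Cost Estimation step. The rates below are
# invented placeholders for illustration only, NOT real Nexus pricing.

PACE_RATES = {
    "Basic": 0.05,      # illustrative cost per image
    "Standard": 0.08,
    "Priority": 0.15,
}

def estimate_cost(num_images: int, pace: str) -> float:
    """Return a rough total cost for annotating `num_images` images."""
    if pace not in PACE_RATES:
        raise ValueError(f"Unknown pace: {pace!r}")
    return num_images * PACE_RATES[pace]

print(estimate_cost(1000, "Standard"))  # → 80.0
```

Whatever the real rates are, moving the image-count slider or switching pace tiers updates the estimate in the same multiplicative way.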

Cost Estimation

Review Details

Once you are happy with your options, you will be brought to the review page, which shows your project description, the number of images to be annotated, the pace at which you would like the annotations to be done, and the estimated total cost. To progress and send in the request, you have to agree to the terms and conditions, which are shown in a tab next to the project summary. If you would like to change any of your previous inputs, you can always return to the earlier steps. Once your request has been made, it is automatically transferred to us, and we will then file a formal request with our external labelling partners. An email will be sent as soon as everything is confirmed.

Review Details

Scheduled Jobs

To see previous jobs that you have scheduled, select Scheduled Jobs on the sidebar of the Nexus homepage. There, you will be able to see all jobs, both completed and currently pending. Each job has a progress bar showing the annotation job’s level of completion, as well as a status at the top right indicating whether it is pending or completed. In the additional settings found at the bottom right, you can export your brief, which shows an overview of the request in PDF, mark the job as completed, or contact the team for additional help.


Common Questions

How is the cost of external labelling requests calculated?

There are a few dimensions used to calculate the cost of a task: difficulty, scale, and speed. Difficulty reflects how demanding the annotation task is; segmentation masks, for example, are much more difficult and time-consuming to annotate well than bounding boxes. Scale considers the average number of annotations per image as well as the number of images that need annotation: based on the sample annotations, we estimate the total number of annotations required and base the cost on this. Finally, speed factors in how many labellers will be needed to label the dataset at a given pace or deadline.
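The three dimensions above can be sketched as multiplicative factors on a per-annotation base rate. Every number in this sketch (the base rate, the difficulty multipliers, and the speed multipliers) is a hypothetical placeholder chosen only to show how the dimensions combine, not Datature's actual pricing model.

```python
# Hypothetical combination of the three cost dimensions: difficulty,
# scale, and speed. All rates and multipliers are illustrative
# placeholders, NOT real pricing.

DIFFICULTY = {
    "bounding_box": 1.0,       # boxes are the baseline
    "segmentation_mask": 3.0,  # masks take far longer to annotate well
}
SPEED = {
    "Basic": 1.0,
    "Standard": 1.25,          # faster delivery needs more labellers
    "Priority": 1.5,
}
BASE_RATE = 0.02               # illustrative cost per single annotation

def estimate_task_cost(annotation_type: str, num_images: int,
                       avg_annotations_per_image: float, pace: str) -> float:
    """Rough task cost: scale (total annotations) x difficulty x speed."""
    total_annotations = num_images * avg_annotations_per_image
    return BASE_RATE * total_annotations * DIFFICULTY[annotation_type] * SPEED[pace]
```

Under these assumed numbers, 100 images averaging 5 bounding boxes each at ‘Basic’ pace would come to 0.02 × 500 = 10.0, and switching to segmentation masks would triple that.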