NETS 2130

Announcements


Course website is live!

The NETS 2130 Fall 2024 course website has finally been launched.

It currently includes minimal materials, but it will rapidly be populated.

Homework 1: "Become a Crowd Worker"

Become a Crowd Worker : Assignment 1

The point of the first assignment is to get a worker’s perspective. Crowdsourcing platforms like Mechanical Turk allow us, the programmers, to work with humans as though they are function calls (fulfilling every antisocial CS student’s dreams). Since, for the rest of the semester, you will be relying on crowd workers to complete work for you quickly and accurately, take this first assignment as an opportunity to understand your workers and the ways you can help them work best for you. Pay attention to what makes tasks interesting and attractive, how much each task pays or which non-financial incentives are effective, and whether you think the compensation is fair.

We will ask you to turn in a short writeup describing your experience. As inspiration for your writeup, you should read this essay by Jeff Bigham called “My MTurk (half) Workday”.

In this assignment you should try to do work on (at least) two crowd platforms from this list:

  • Mechanical Turk
  • Zooniverse
  • Google Crowdsource
  • CrowdGen
  • Upwork
  • Mercor (with thanks to Avi Upadhyayula!)

Submission Requirements

  • Write about your experience working on the two platforms, spending at least 2-3 hours on each platform.
  • If working with a partner, do the tasks together over Zoom rather than dividing the work.
  • Analyze each platform from the perspectives of both a worker and a regulator aiming to increase efficiency.
  • Write as if addressing a student in next year’s NETS 2130 class.
  • There is no length requirement as long as you comprehensively address the questions asked.
  • Include relevant screenshots to clarify your points (e.g., a screenshot of the worker sign-up process when discussing it).
  • Create your write-up in Markdown format, but submit as a PDF on Gradescope (submission details to be provided next week).

Overarching Questions

As you complete the assignment, keep these high-level goals/questions in mind:

  1. Provide concrete examples from your experience to illustrate your points, such as screenshots, quotes, or anecdotes about noteworthy tasks or interactions.

  2. Compare and contrast the different platforms’ approaches to crowdsourcing and worker management. What are the pros and cons of each from a worker’s perspective?

  3. Consider the ethical implications of the various platforms’ models. Are workers treated and compensated fairly? Are there potential issues around data privacy, intellectual property, or the replacement of human jobs with AI?

  4. Think critically about crowdsourcing’s role in the broader labor market and economy. What types of work are best suited for crowdsourcing? What are the limitations? How might platforms evolve in the future?

  5. Suggest ways each platform could improve the worker experience and task quality based on your firsthand observations. This could include UI/UX enhancements, changes to task allocation and compensation, better communication channels, new features, etc.

Address these overarching questions in your write-up to develop an understanding of the crowdsourcing platforms and their implications for workers, businesses, and society as a whole.

Mechanical Turk

Overview: Mechanical Turk is a crowdsourcing website that allows Requesters to hire crowd workers to perform microtasks in return for micropayments. It is operated under Amazon Web Services and owned by Amazon. Employers post jobs known as Human Intelligence Tasks (HITs), so named because MTurk was designed for tasks that are easy for people but that computers are currently unable to do. HITs can be tasks such as identifying content in an image, writing product descriptions, or answering questions. Workers, also known as Turkers or crowdworkers, browse existing jobs and complete them in exchange for a rate set by the employer.
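To make the “humans as function calls” framing concrete, here is a minimal sketch of what posting a HIT looks like from the Requester side. The title, reward, and question XML below are made-up examples, not part of the assignment; a real Requester would pass these parameters to boto3’s MTurk client via `boto3.client("mturk").create_hit(**params)`.

```python
# Sketch of the Requester side of MTurk (illustrative values only).
# In this assignment you will be on the *worker* side of these tasks.

def build_hit_params(title, description, reward_usd, question_xml):
    """Assemble keyword arguments for MTurk's create_hit call."""
    return {
        "Title": title,
        "Description": description,
        "Reward": f"{reward_usd:.2f}",       # paid per assignment, in USD
        "MaxAssignments": 3,                  # number of workers per HIT
        "AssignmentDurationInSeconds": 600,   # time a worker has to finish
        "LifetimeInSeconds": 86400,           # how long the HIT stays listed
        "Question": question_xml,             # an MTurk QuestionForm XML blob
    }

params = build_hit_params(
    title="Identify the object in this image",
    description="Look at one image and type the object you see.",
    reward_usd=0.05,
    question_xml="<QuestionForm>...</QuestionForm>",
)
```

Note how little of the worker’s experience is visible from this side: the Requester sets a price and a time limit, and everything else is abstracted away.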

How to Sign Up:

  1. Sign up for MTurk as a worker here. Please do this as early as possible, since it can take 48 hours or more to have your account approved. Foreign students may have problems signing up since Amazon now requires a Social Security Number. Problems have also arisen when students attempt to set up an Amazon Payments account, which is required by MTurk.

  2. Optionally (but strongly recommended), try installing a MTurk productivity tool like TurkerView or Turkopticon, or check out the ITsWorthTurkingFor subreddit.

Questions to address in your writeup:

  • How long did it take you to sign up as a worker?
  • What is your Worker ID? If you signed up for Mechanical Turk you can look up this information on your Dashboard (https://www.mturk.com/mturk/dashboard).
  • How many assignments did you complete? You can look up this information on your Dashboard (https://www.mturk.com/mturk/dashboard).
  • What was your goal when you were working?
      • Make as much money as possible
      • Get a high hourly wage
      • Get through the assignments as quickly as possible
  • How much money did you make?
  • How much time did you spend working?
  • Do you think you could make a reasonable wage if Mechanical Turk was your sole source of income?
  • Did any of your HITs get rejected? You can find out on your Dashboard (https://www.mturk.com/mturk/dashboard).
  • If you did have a HIT rejected, what reason did the Requester give?
  • What types of tasks did you do, and how did you pick them?
  • What was the highest paying HIT that you completed?
  • What HIT paid you the highest hourly rate?
  • Describe the most interesting or weirdest or most fun HIT that you found. Write a couple of sentences describing it.
  • Did you try any of the productivity tools? What did you think?

Zooniverse

Overview: Zooniverse is a citizen-science platform in which volunteers around the world contribute to projects sourced from the professional research community. With over 1.6 million registered users, the platform supports research projects by allowing volunteers to perform tasks such as completing surveys and labeling datasets.

How to Sign Up:

  1. Go to https://www.zooniverse.org/ and click on “Register” in the top right corner of the page.
  2. Input your personal information to create your Zooniverse account.
  3. Go to https://www.zooniverse.org/projects and click on any of the available projects to reach its project overview page.
  4. Read about your selected project on its project overview page and begin contributing by clicking on “Classify” at the top of the page. Task instructions will be available for your reference.

Questions to address in your writeup:

  • What projects did you select? Why did you pick them?
  • What were the specific tasks that you had to complete?
  • Did you find the task instructions easy to follow?
  • What are examples of completed projects and publications that you see featured on the platform?
  • How much time did you spend on each project in total, and how long did its individual tasks take?
  • All individuals on Zooniverse participate as volunteers – what do you think incentivizes individuals to contribute through this platform?

Google Crowdsource

Overview: Google Crowdsource is a platform where volunteers help improve the AI building blocks in Google products. Examples of Google products that integrate AI include Google Translate, Maps, Photos, and Assistant. In Google Crowdsource, you’ll perform small tasks (like writing a caption for an image or recognizing facial expressions). The results will be used to train AI models that make these products better and improve their performance for a diverse population of users.

See more information about it at: https://crowdsource.google.com/about/

How to Sign Up:

  1. Go to https://crowdsource.google.com/ and log into a Google account
  2. Once logged in, make sure you are on the https://crowdsource.google.com/ main page
  3. You should see tasks like “Image Label Verification”, “Handwriting Recognition”, and “Facial Expressions” that you can complete. It’s fairly straightforward to click on a few of these and try solving the tasks.
  4. On the achievements page (https://crowdsource.google.com/achievements), you’ll see what level you are, a summary of your activity, and “badges” you can earn by completing tasks.
  5. Please complete Level 1 for the following badges:
  • Earn Level 1 on the Contributor badge (you achieve this by completing 100 tasks total).
  • Choose any one of the other badges that interests you and get Level 1 on it. You can click on badges to see a description of what you need to do in order to earn them.

Questions to address in your writeup:

  • What projects did you select? Why did you pick them?
  • What were the specific tasks that you had to complete?
  • Did you find the task instructions easy to follow?
  • How many total contributions did you make on Google Crowdsource? You can find this information on your achievements page.
  • What badges did you earn? What level are you on each badge? What were the requirements for each badge?
  • If you were designing badges for Google Crowdsource, can you think of a new badge you would create? Give it a name and a description of what it would take to earn Level 1 for your badge.
  • How did you handle ambiguous cases that you were unsure of? Did you make an educated guess or did you skip it?
  • How do you like the UI/UX design of Google’s annotation tool, does it allow you to complete your tasks efficiently? What improvements, if any, would you make?
  • Do you think Google verifies the reliability/trustworthiness of annotators like you? If you were to design a system that filters out “bad” annotators (spammers, bots, NETS 2130 students rushing through to complete the assignment), what would you do?
  • Why do you think this platform doesn’t offer money as an incentive like Amazon MTurk?
  • Do you think you would contribute to this platform if it weren’t for a grade? Do you think others would?
  • Had you heard about Google Crowdsource before? Do you think it’s easy to find for most users? If you worked at Google, how might you encourage people to use the platform?
  • Why do you think Google implements a badge system instead of just showing you the total number of tasks completed?
  • Describe the most interesting or weirdest or most fun job that you were assigned. Write a couple of sentences describing it.
  • Describe the hardest or most ambiguous job that you were assigned. Write a couple sentences describing it.
  • For each of the annotation categories, try to name a Google product or service that would benefit from your work. The categories are: Image Label Verification, Image Caption, Handwriting Recognition, Facial Expressions, and Landmarks.
  • Write a review (brief, just a couple sentences) of Google Crowdsource from a user’s perspective. What did you like? What did you hate? What did you find frustrating? What would you like to see improved?
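One of the questions above asks how you might filter out unreliable annotators. A classic approach (sketched here as our own illustration, not Google’s actual system) is to embed “gold” questions whose answers are known in advance and score each worker against them:

```python
# Filter annotators using hidden gold-standard tasks (illustrative sketch).
def reliable_annotators(answers, gold, min_accuracy=0.8):
    """
    answers: {worker_id: {task_id: label}}
    gold:    {task_id: correct_label} for the hidden check tasks
    Returns the set of workers whose accuracy on gold tasks
    meets the threshold.
    """
    keep = set()
    for worker, labels in answers.items():
        graded = [(t, lab) for t, lab in labels.items() if t in gold]
        if not graded:
            continue  # worker saw no gold tasks; cannot judge yet
        correct = sum(1 for t, lab in graded if lab == gold[t])
        if correct / len(graded) >= min_accuracy:
            keep.add(worker)
    return keep

answers = {
    "careful": {"t1": "cat", "t2": "dog", "t3": "cat"},
    "spammer": {"t1": "cat", "t2": "cat", "t3": "cat"},
}
gold = {"t1": "cat", "t2": "dog"}
```

Here the worker who answers “cat” to everything fails the gold checks, while the careful worker passes. Real systems combine this with agreement between workers, response-time checks, and reputation over time.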

CrowdGen

Overview: CrowdGen is Appen’s crowd work platform; its predecessor Figure Eight (formerly CrowdFlower) was acquired by Appen for $300 million in 2019. It offers a variety of crowd work employment options, ranging from the microtasks we have discussed to longer-term projects.

How to Sign Up:

  1. To sign up, go to https://crowdgen.com and choose ‘Earn remotely now’. From there you’ll need to create an Appen Contributor Account, which takes your info and a PayPal account for you to receive your payments.
  2. At sign-up there are also Projects and Surveys & Data Collection options, but these require applying (similar to applying on MTurk); to get a feel for the platform, it’s easiest to quickly sign up for the Micro Tasks platform.

Questions to address in your writeup:

  • Were you able to easily figure out the sign-up process and the two different platforms?
  • Did you understand how you find tasks to complete once you’re on the platform?
  • Did you find the task instructions easy to follow?
  • What do you think of the badge system that Appen uses to allocate work to certain workers?
  • How many tasks did you complete and what was your reward?
  • Did you like the setup of the tasks?

Upwork

Overview: Upwork is a platform for a variety of freelance tasks; it is not specific to crowdsourcing. Services that you can offer include IT, design, or crowdsourcing services. Upwork goes both ways: you can either post your skills or apply to projects posted by clients. The platform provides a payment guarantee for freelancers as well as a job board.

How to Sign Up:

  1. Go to https://www.upwork.com/signup/?accountType=client and register. Please select “Work as a Freelancer” during sign-up.
  2. After you’ve verified your email address, you’ll be asked to fill out your profile. It’s important to do this thoroughly, as Upwork approves you based on this information. When asked about your experience level, select “Beginner” unless you have had some previous CS experience.
  3. Once your account is approved, you can start searching for crowdsourcing projects by using the search function in the project section.

Questions to address in your writeup:

  • What was the project that you selected?
  • Were you able to get accepted by the client?
      • If yes: What were the specific tasks that you had to complete? Did you find the task instructions easy to follow? How much time did you spend on this project?
      • If you didn’t get selected: Why do you think the client picked someone else? Did you think the selection process was fair?

Mercor

Overview: Mercor is an AI-powered talent marketplace that matches job candidates with open roles. Founded in 2023 by three 21-year-old Thiel Fellows, the startup has developed software that uses large language models to vet and interview applicants, assessing their skills and fit. Candidates upload their resumes and complete a 20-minute AI video interview covering their background and a case study. Mercor’s AI then matches them to relevant opportunities, which can range from temporary contract work to full-time roles across engineering, product, design, writing and more. Employers use Mercor to quickly find and hire pre-vetted talent on flexible terms. As of September 2024, Mercor has conducted over 100,000 interviews and evaluated 300,000 candidates. The startup is growing rapidly and recently raised a $32M Series A at a $250M valuation.

How to Sign Up:

  1. Go to https://work.mercor.com/apply/ to start the job seeker application process.
  2. Provide your basic information and upload your resume.
  3. Complete Mercor’s 20-minute AI video interview, which covers your background and experience (10 min) and a relevant case study (10 min).
  4. Your application is then complete and will be automatically matched by Mercor’s AI models to open roles that fit your skills and qualifications.
  5. For certain specialized roles, you may be asked to complete an additional tailored AI interview.

Questions to address in your writeup:

  • How long did the sign up process take from start to finish? What information and documents were you required to provide?
  • Describe your experience with Mercor’s AI video interview. What kinds of questions did it ask about your background and in the case study? Did the questions seem relevant to your skills and the types of roles you’re seeking?
  • Did you feel the AI interview allowed you to fully showcase your abilities? How did it compare to a human-led interview?
  • Was the AI easy to understand and interact with? Did it seem to respond intelligently to your answers?
  • Did Mercor’s sign up process make you feel like the AI was able to get a comprehensive view of you as a candidate? Did you have opportunities to provide additional context or clarification if needed?
  • Would you be comfortable with Mercor’s AI being the primary screener and matchmaker for job opportunities? Do you trust the AI to evaluate you fairly and recommend you for the right roles?
  • What was your overall impression of Mercor’s AI-driven hiring process as a job seeker? Did it feel more or less personal than a traditional hiring process?
  • After signing up, were you matched with any relevant open roles? How well did these align with your background and career goals?
  • Did you have any concerns or hesitations about interviewing with AI? How did Mercor try to address these, if at all?
  • Would you recommend Mercor to other job seekers based on your experience with their AI-powered platform? Why or why not?


Bonus Platforms ⭐️

For students who want to go above and beyond, we encourage you to explore some additional crowdsourcing platforms that we haven’t had the opportunity to fully scope out for this assignment.

If your work is of high quality and provides valuable insights, you may earn an extra point for contributing to the improvement of this assignment for future classes, as well as special consideration if you are interested in becoming a TA for this class.

The goal is to assess whether these platforms would be a good addition to the assignment, document your experience using them, and analyze how they differ from the platforms we’ve already covered.

Here are the bonus platforms to consider:

  1. Scale AI: A fast-growing startup that specializes in labeling and categorizing data for AI/ML applications. Known for high-quality annotations and competitive pay.

  2. Fiverr: A popular freelance services marketplace that spans creative fields, digital marketing, programming, business services and more. Offers an interesting comparison to platforms focused solely on microtasks or data-related work.

  3. Mighty AI: An AI data labeling platform that was acquired by Uber in 2019. Studying its evolution and integration with Uber could provide insights into how crowdsourcing fits into larger tech organizations.

  4. Prolific: An online research platform that connects academics and businesses with participants for surveys and market research studies. Has a unique focus on scientific inquiries.

  5. Clickworker: A microtask platform that breaks larger data processing and analysis projects into smaller tasks distributed to its multinational workforce. Serves enterprise clients across several languages.

Case Studies of Different Platforms

Among the bonus platforms, Fiverr and Prolific stand out as particularly interesting case studies due to their unique characteristics and potential for meaningful comparisons with the platforms we’ve already examined.

Fiverr is distinct in that it:

  1. Covers a broad range of creative and professional services beyond microtasks or data-related work
  2. Gives freelancers more control over setting their own rates and defining their service offerings
  3. Tends to have larger-scope, longer-duration projects compared to typical microtasks

Analyzing the worker experience on Fiverr could provide valuable insights into how traditional freelance marketplaces differ from microtask-focused platforms.

Prolific is unique because it:

  1. Focuses specifically on connecting researchers with study participants for academic and scientific purposes
  2. Features more interactive, engaging tasks like surveys, experiments, and games
  3. Prioritizes ethical research practices and fair participant compensation

Exploring Prolific could shed light on the role of crowdsourcing in academic and scientific contexts and how labor practices in these settings compare to other platforms.

We encourage interested students to sign up for these platforms, complete some tasks, and share their experiences and insights. How do these platforms differ from the ones we covered in class? What are the pros and cons for workers? What unique ethical considerations arise? Your firsthand accounts and analysis could help shape the future of this assignment and deepen our collective understanding of the crowdsourcing landscape.