Work in the United States may refer to:
- Economy of the United States
- Labor unions in the United States
- United States labor law – US laws governing fair pay, working conditions, unions, workplace democracy, equality, and security at work
- Work–family balance in the United States
- Work–life balance in the United States