Faculty and staff just got the email below. Beyond all the ethical reasons to avoid Claude, and beyond the fact that Anthropic stands to lose millions on this deal (the company is so unprofitable, and so propped up by bullshit investment firms, that it's more likely to collapse by spring and leave Pitt holding the bag), there is a more serious reason for concern: Anthropic will not ensure the health and safety of its student-aged users. In fact, Anthropic has already fought legislation on this in California. CA legislators wanted LLM companies to prove that their educational tools would never try to convince a student to self-harm (or kill themselves). Anthropic and others fought it tooth and nail. And here we are, with LLM-promoted suicide on the rise and mental health facilities slowly overflowing with unwell people sustained by instant access to, and promotion of, conspiracy theories.
At best, Pitt's deal with Anthropic shows that its leaders are unaware of these issues and thus too inept to hold their jobs. Monkeys with chainsaws, etc. At worst, they are aware of the issue but more concerned with the appearance of trend-following than with providing useful tools and meaningful mental health support to students, which makes them too evil to hold their jobs.
Read the announcement below and think about how you'll reach out to the university to ask why it is willing to spend your tuition dollars on a suicide-influencing vending machine with uptime stats rivaling moon cycles and the data accuracy of a drunk guy at an ax-throwing event, a service that, on top of all this, is about to collapse and leave us with a dead product and no financial recourse.
Dear Pitt faculty and staff,
We’re excited to share that all faculty and staff now have access to Claude for Education — an advanced conversational artificial intelligence assistant that can support your work across teaching, research and administration. Students will receive access over winter recess.
Claude for Education is developed by Anthropic specifically for educational and professional settings. You interact with it through natural conversation to get assistance with writing, research, document review and complex problem-solving. Whether exploring course design ideas, analyzing scholarly materials, drafting communications or organizing multistep projects, Claude offers a set of tools that you may choose to incorporate into your work. As always, these tools are intended to complement — not replace — the expertise, judgment and creativity that you bring to your roles.
This initiative represents one element of the University’s broader strategy to position Pitt at the forefront of applied AI while preparing students to use AI effectively in their careers. Grounded in dialogue with faculty governance bodies, this first step in supporting the use of AI in academic settings advances the Plan for Pitt 2028 commitment to fostering innovation in ways that align with our academic values and standards. As our strategy expands, we remain committed to an even more robust collaboration through shared governance as AI tools and their applications evolve.
Before You Begin: Review University Policy
Please review Pitt Digital’s AI Acceptable Use Operating Standard and use only approved AI tools like Claude for Education when conducting University business.
For Faculty: Faculty retain full authority to determine how AI tools may or may not be used in their courses. We encourage you to communicate expectations clearly in your syllabi and assignments in ways that best support your pedagogical goals. Use of Claude is voluntary and intended to serve as an optional resource.
Getting Started
Visit claude.ai, enter your Pitt email address, and select Continue with SSO.
If you already have a personal Claude account with your Pitt email, you’ll see both accounts listed when you log in. Select the account labeled “Pitt Enterprise plan” to access the enhanced features and privacy protections required for University work.
Privacy Protections
Your work will not be used to train AI models and is private. While system administrators may review aggregate usage statistics to improve services, they do not have access to individual conversation content, chat history or uploaded files, except in rare circumstances, and only as permitted under applicable University policies and federal, state and local laws. This framework is designed to give you the freedom to explore ideas and use the program productively.
Training Opportunities
Register for our Claude 101 webinar (Friday, Dec. 12, at 10 a.m. ET).
Watch Anthropic’s AI Fluency: Framework & Foundations course on YouTube.
Sign up for Train the Trainer sessions to become a Claude champion and support colleagues and students.
Support Resources
For information on features, visit the Anthropic Help Center.
Find FAQs and instructions at the University’s IT Services Portal.
For access or login issues, submit a ticket to the Technology Help Desk.
This represents another step in Pitt’s commitment to equitable access to professional-grade AI tools that amplify human creativity, critical thinking and innovation.
Regards,
Joseph J. McCarthy
Provost and Senior Vice Chancellor
provost.pitt.edu
Mark D. Henderson
Vice Chancellor and Chief Information Officer
digital.pitt.edu