Wharton@Work
April 2026 | Nano Tools | Innovation

AI Adoption Is a Challenge. Here’s a Solution.

Nano Tools for Leaders® are fast, effective leadership tools that you can learn and start using in less than 15 minutes — with the potential to significantly impact your success as a leader and the engagement and productivity of the people you lead.

Goal

Help leaders reduce resistance to generative AI by addressing employees’ core psychological needs — competence, autonomy, and relatedness — through the AWARE framework.

Nano Tool

Organizations are investing heavily in generative AI, but adoption is lagging and resistance is rising. Recent cross-industry research shows that 31 percent of U.S. knowledge workers admit to actively working against their company’s AI initiatives, and 41 percent of Gen Z workers report the same. Meanwhile, 85 percent of leaders and 78 percent of managers regularly use gen AI, but only 51 percent of workers do. More than half of employees say they would use AI tools without formal approval, and nearly one-third keep their use hidden from their employers.

This adoption gap erodes productivity gains, weakens trust, and delays return on investment. But it’s not simply a training deficit. It is a psychological one. Research on workplace motivation consistently shows that employees thrive when three core needs are satisfied: feeling capable and effective (competence), feeling in control of their work (autonomy), and feeling connected and respected (relatedness). Gen AI can strengthen these needs by expanding skills and reducing drudgery, but it can also threaten them by redefining expertise, mandating rigid workflows, or disrupting collaboration. When those needs are frustrated, resistance is predictable.
The AWARE framework offers leaders a disciplined way to address the human side of AI integration by acknowledging concerns, watching coping behaviors, aligning support, redesigning work for human-AI complementarity, and empowering employees through transparency and participation. Leaders who apply it treat AI implementation not as a technical rollout, but as an organizational transition — one that determines whether AI becomes a productivity accelerator or a source of division and disengagement.

Action Steps

1. Acknowledge psychological impact: Surface concerns instead of suppressing them. Openly recognize how AI may affect identity, skills, and job security. Name the competence threat (“This may feel like it’s redefining what expertise means”), address autonomy concerns (“We don’t want this to feel imposed”), and validate relatedness anxieties (“This changes how we collaborate”). Acknowledgment builds psychological safety and reduces quiet resistance.

2. Watch coping behaviors: Pay attention to how employees respond — both adaptively and maladaptively. Adaptive behaviors include skill building, workflow experimentation, and peer collaboration. Maladaptive behaviors include withdrawal, avoiding AI-related tasks, “shadow AI” use without disclosure, and passive resistance or open opposition. Monitoring usage patterns and listening carefully allows leaders to intervene early and empathetically.

3. Align support systems: Training alone is insufficient. Support must align with psychological needs. Build competence through hands-on experimentation and role-specific learning; preserve autonomy with flexible, personalized learning pathways; and strengthen relatedness through peer coaching and collaborative forums. Avoid one-size-fits-all training; instead, tailor development journeys to skill level and readiness.

4. Redesign roles for complementarity: Don’t simply “plug AI into” existing workflows. Redesign work to balance automation and augmentation by assigning repetitive, data-heavy tasks to AI; elevating human tasks that require judgment, empathy, creativity, and ethics; and redefining roles to increase ownership and strategic contribution. End-to-end workflow redesign fosters engagement more effectively than tool deployment alone.

5. Empower through transparency and participation: Empowerment requires more than access — it requires voice. Communicate clearly about what AI will and will not change, involve employees in identifying high-value use cases, and provide inclusive access to tools and training. When workers help shape implementation, they become co-creators rather than reluctant adopters.

How Leaders and Organizations Use It

The following examples show leaders treating AI adoption as a leadership challenge rather than a technical one — redesigning work for the future while protecting competence, preserving autonomy, and strengthening connection:

PwC’s “My AI” initiative combines tools, hands-on experimentation (“prompting parties”), and peer “activators” embedded across the firm. The approach builds competence through practice, preserves autonomy through experimentation, and strengthens relatedness through social learning.

Moderna merged technology and HR into a unified People and Digital Technology function to redesign AI-enabled workflows collaboratively. Dell simplified sales processes before introducing AI tools, freeing teams for higher-value customer work. Both focused on human-AI complementarity rather than plug-and-play deployment.

BNY broadened access to AI tools across the workforce, enabling thousands of employees to build their own agents. Companies such as Colgate-Palmolive and Johnson & Johnson involve employees directly in identifying AI use cases. Participation fosters ownership and reduces resistance.
Knowledge in Action: Related Executive Education Programs

Generative AI and Business Transformation
Analytics for Strategic Growth: AI, Smart Data, and Customer Insights
Strategies for Accountable AI
AI in Marketing: Creating Customer Value in an AI-Driven Enterprise

Contributors to this Nano Tool

Erik Hermann, interim professor of marketing at European University Viadrina, Germany, is a researcher focused on the psychological and behavioral effects of emerging technologies in organizations.

Stefano Puntoni is the Sebastian S. Kresge Professor of Marketing at the Wharton School and co-director of Wharton Human-AI Research. His research examines how artificial intelligence reshapes decision making, consumer behavior, and the future of work.

Carey K. Morewedge is a professor of marketing at Boston University’s Questrom School of Business. His research explores judgment, decision making, and the psychological drivers of behavior in organizations and markets.

This Nano Tool is adapted from their research and article in the Harvard Business Review.

About Nano Tools

Nano Tools for Leaders® was conceived and developed by Deb Giffen, MCC, Director of Innovative Learning Solutions at Wharton Executive Education. It is jointly sponsored by Wharton Executive Education and Wharton’s Center for Leadership and Change Management, Michael Useem, Director. The Nano Tools Academic Director is John Paul MacDuffie, Professor of Management at the Wharton School and Director of the Program on Vehicle and Mobility Innovation (PVMI) at Wharton’s Mack Institute for Innovation Management.