2025 Will Widen the Digital Divide

Plus, our new custom GPT course is now available.

[image created with DALL·E 3 via ChatGPT Plus]

Welcome to AutomatED: the newsletter on how to teach better with tech.

In each edition, I share what I have learned — and am learning — about AI and tech in the university classroom. What works, what doesn't, and why.

Today, I explain why I think 2025 is a key moment for many educational institutions and educators, at least with respect to the growing AI digital divide.

I also provide some AutomatED updates, including the release of our new custom GPT course and the release of my AI-powered department course scheduler (email me if interested; more information on it at the bottom of this message).

💡 Idea of the Week:
Don’t Let Your Colleagues Wait

This week, I will be speaking about AI at the WONDER conference hosted by NOVA and then at Lake Michigan College.

The following week, I return to Michigan to talk with faculty at Southwestern Michigan College and then go a little south to discuss AI pedagogy with graduate students at Notre Dame.

The commonality across these diverse audiences is that they are all working hard to stay ahead of the AI curve.

This is not just wise — it is essential. AI is yet another source of digital divides, and those institutions, departments, faculty, and staff who aren’t working hard on how it can be useful for them are going to fall behind.

I think 2025 is a turning point: the year when the AI digital divide becomes stark.

Consider two recent developments that illustrate why:

Google just launched "Deep Research," an AI research assistant in Gemini Advanced that autonomously explores complex topics and compiles comprehensive reports with source links. As Google describes it (here’s also a video that conveys how it works):

"Over the course of a few minutes, Gemini continuously refines its analysis, browsing the web the way you do: searching, finding interesting pieces of information and then starting a new search based on what it's learned. It repeats this process multiple times and, once complete, generates a comprehensive report of the key findings."

Meanwhile, OpenAI just announced o3, their next frontier model (the name ‘o2’ is being skipped because of a trademark conflict). Their internal testing shows it outperforming PhD-level experts on scientific reasoning tasks and beating all of their programmers (except one?) on coding benchmarks. In fact, o3 can solve novel mathematical problems that would "take professional mathematicians hours or even days to solve." Here’s their YouTube video with more:

On the basis of this progress, OpenAI's CEO Sam Altman stated in a blog post released yesterday that the company is now “confident we know how to build AGI” [artificial general intelligence] and predicted that in 2025, “we may see the first AI agents join the workforce and materially change the output of companies.”

Many faculty I talk to about developments like these are already behind. They don’t know basic facts about AI's functionality, aren’t working to integrate it with their workflows, data, and field-specific tasks, and aren’t teaching their students how to use it, despite their students’ interest in it (for evidence of the latter, see here, here, here, and here).

Those of you reading this newsletter are likely among the minority of faculty who see what's coming, even if you differ with me and each other about the details. You're in what I observe to be the smallest of four groups:

  1. Those who are already there; power users or informed skeptics who have thought deeply about AI's implications

  2. Those who wish they knew more and are actively working to learn and adapt

  3. Those who wish they knew more but haven't made the time

  4. Those who actively resist engagement, often virtue signaling about AI's risks while remaining willfully ignorant of its functionality and its impacts

2025 will be defined by two major trends that make your role as an early adopter crucial for your department and institution:

First, agential AI — AI that can guide itself with more autonomy — will become increasingly capable. Tools like Google's Deep Research aren't just research assistants; they're precursors to AI systems that can actively pursue complex academic tasks with minimal oversight. This isn't just Google's goal — it's an express aim of many major AI developers.

Second, AI will have increasing access to context-sensitive data. For those of us already working to connect AI with such data (like my department course scheduler project), we know its capabilities are remarkable when properly integrated with domain information or field-specific knowledge. From small projects like mine to massive initiatives at Apple and Microsoft, the integration of AI with institutional data and workflows is accelerating.

These trends mean that departments and programs that delay restructuring their curricula and pedagogy around AI will quickly fall behind those that act now. The gap won't just be in efficiency; it will be in the fundamental quality of education they can provide. Students need to learn not just their field's content, but how to leverage AI within it.

This means you, as an early adopter in your department, have a crucial role to play in 2025. Your institution needs more than just AI policies, vague statements from the provost, and monthly reading groups (which, while well-intentioned, don't move any meaningful needles). It needs concrete plans for instructional integration, program restructuring, and faculty training.

Without them, you’ll end up on the wrong side of the latest digital divide.

Stay tuned as I report back on whether I am right.

Next week, I give my take on how well Deep Research works in its current form. (And if you have any tough problems for me to test o1-pro with, email me.)

🤖 New Custom GPT Course
& Other Updates

2025 brings many changes to AutomatED…

Weekly Newsletters Again but Shorter

I know that my newsletters are often too long and you’re always too busy. In 2025, I’ll be working hard to write more concisely.

But this brings a new problem: there is too much going on in AI land, and sometimes a deep dive is necessary for a tricky ethical, pedagogical, or technical issue.

Solution: I keep you updated with weekly newsletters, and you can get ✨Premium or come to the monthly webinars if you want deep dives.

Custom GPT Course

The course is now live on LearnWorlds (our e-learning host), and you can access it via this link:

I think you’ll like it a lot. Beta feedback has been very positive so far!

If you are a ✨Premium subscriber, your discount code for 20% off is immediately below:

✨Premium Improvements

2025 brings some big improvements to Premium subscriptions — while the price remains the same ($99 per year):

  • Free Access to Monthly Webinars

  • Discounts on Courses

  • Premium Sections in Weekly (Free) Newsletters

    • After this week, there will be a section in every Monday’s (free) newsletter that is only for Premium subscribers. I’ll include polls to help determine what information Premium subscribers want me to provide there, but it will definitely include prompting techniques, commentary and predictions, and insights into more advanced uses of AI.

  • Free 1-Hour Consultation with Me Each Year

    • In addition to my presentations, I consult with dozens of professors each semester about their tech-related pedagogical questions/challenges, and I build custom AI tools for them, too. As a Premium subscriber, you’ll have access to an hourlong free consult each year where we can discuss anything you’d like.

  • No Sponsor Advertisements Shown to You

This is in addition to all pre-existing ✨Premium benefits (all 17 Guides and Tutorials accessible here):

  • Comprehensive Pedagogy Guides

  • AI Use Case Tutorials

  • AutomatED AI Tools

  • Two New Premium Pieces Per Month

    • I’ve got a bit of a backlog right now, but at least the following 6 pieces will be sent in January and February:

      • Tutorial: Anonymize Canvas Submissions in Microsoft 365

      • Tutorial: How Profs Can Use Copilot in Microsoft 365

      • Tutorial: How Profs Can Use Gemini in Workspace

      • Tutorial: How to Use Google’s LearnLM

      • Tutorial: Zapier for Professors and Learning Specialists

      • Tutorial: How to Teach With Google’s Deep Research

If this all sounds good to you, you can upgrade here:

January’s Webinar

As I mentioned at the end of last year, I am going to turn to more advanced topics in 2025, both in my ✨Premium content and in my paid webinars. Free newsletters and free webinars will stay focused on the basics and on keeping everyone up to speed.

The next webinar will be paid but free for ✨Premium subscribers. Please let me know via the below poll what you’d like it to cover (after you make a selection, you will be redirected to the website where you can also offer typed comments, if you want):

What would you like the next (paid) AutomatED webinar to cover?

Only select what you'd be willing to pay for...


If I were to do one on custom GPTs, Gems, and Claude Projects, I would focus on topics that are not covered in my pre-existing Tutorial or the new course: API connections, workflow optimization, when to pivot to Zapier and LLMs’ APIs, and so on.

📬 From Our Partners:
Custom GPT RAG Issues?

Writer RAG tool: build production-ready RAG apps in minutes

RAG in just a few lines of code? We’ve launched a predefined RAG tool on our developer platform, making it easy to bring your data into a Knowledge Graph and interact with it using AI. With a single API call, Writer LLMs will intelligently call the RAG tool to chat with your data.

Integrated into Writer’s full-stack platform, it eliminates the need for complex vendor RAG setups, making it quick to build scalable, highly accurate AI workflows just by passing a graph ID of your data as a parameter to your RAG tool.
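
For the curious, the general pattern described here (one call that passes a knowledge-graph ID so the model can retrieve from your data) looks roughly like the sketch below. The endpoint, field names, and IDs are purely illustrative placeholders of mine, not Writer's documented API; consult their developer docs for the real thing.

```python
# Illustrative only: a generic "chat with your data via a graph ID" request.
# The URL, payload fields, and IDs below are placeholders, not Writer's real API.
import requests

API_URL = "https://api.example.com/v1/chat"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                      # placeholder credential
GRAPH_ID = "your-knowledge-graph-id"          # ID of the Knowledge Graph holding your data

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "example-rag-model",         # placeholder model name
        "graph_id": GRAPH_ID,                 # the single parameter that points at your data
        "messages": [
            {"role": "user", "content": "Summarize my syllabus's late-work policy."}
        ],
    },
    timeout=30,
)
print(response.json())
```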

🧪 AI Department Course Scheduler:
It’s Here

After 3 months of development, my department course scheduling AI tool is now ready.

The system can handle faculty-to-course assignments, course-to-room scheduling, or both, working directly with your existing spreadsheets and faculty preference data — no format changes needed and no proprietary forms.
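
To give a feel for what “working directly with your existing spreadsheets” means in practice, here is a deliberately tiny sketch of preference-based faculty-to-course matching. It is not the actual tool (which also handles rooms, constraints, and messy real data); the column names, sample rows, and greedy assignment rule are my own simplified stand-ins.

```python
# Minimal illustrative sketch (not the AutomatED tool itself): greedy
# faculty-to-course assignment from spreadsheet-style preference data.
# The column names and data below are hypothetical.
import csv, io

PREFS_CSV = """faculty,course,preference
Dr. Lee,PHIL 101,1
Dr. Lee,PHIL 320,2
Dr. Ortiz,PHIL 101,2
Dr. Ortiz,PHIL 320,1
"""

def assign_courses(prefs_csv: str) -> dict[str, str]:
    rows = list(csv.DictReader(io.StringIO(prefs_csv)))
    rows.sort(key=lambda r: int(r["preference"]))   # strongest preferences first
    assigned_faculty, assigned_courses, schedule = set(), set(), {}
    for r in rows:
        if r["faculty"] not in assigned_faculty and r["course"] not in assigned_courses:
            schedule[r["course"]] = r["faculty"]
            assigned_faculty.add(r["faculty"])
            assigned_courses.add(r["course"])
    return schedule

print(assign_courses(PREFS_CSV))  # {'PHIL 101': 'Dr. Lee', 'PHIL 320': 'Dr. Ortiz'}
```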

Given the costs of developing the tool, running it, maintaining it, and ensuring user satisfaction with its outputs, I am now treating it as a “product” of AutomatED that I’ll offer at a (very low, relative to the market) price.

For the January-March scheduling period, I have about 20 remaining slots available.

I’m offering “beta” participants the following:

  • Free initial compatibility check of your existing data (I show you that it can handle your upcoming semester by running your prior semester)

  • Promised 5-day turnaround on optimized schedules

  • Multiple revision cycles for adjustments (you’d be surprised how many contradictions, ambiguities, lacunae, etc. exist in the average dataset I encounter)

  • Early access pricing

Interested?

The first step, in most cases, is to email me a sample of your past scheduling materials (Fall 2024 would be ideal) for a compatibility assessment.

But feel free to reach out to discuss first, if you’d like…

Note: I hope to have a self-service web interface live in February. For now, beta interactions are via email with full support.

What'd you think of today's newsletter?


Graham

Let's transform learning together.

If you would like to consult with me or have me present to your team, discussing options is the first step:

Feel free to connect on LinkedIn, too!