When ChatGPT launched in November 2022, it became clear that Large Language Models (‘LLMs’) would cause huge disruption. Now that Bing has incorporated GPT-4 and Google has semi-launched Bard, it’s clear that disruption will include SEO.
It’s impossible to predict how this will play out. Maybe:
- The web will be flooded with mediocre LLM-generated content and search engines will aggressively filter it
- Bing will grab huge market share
- Google won’t drive traffic to the sites it scrapes for its index / corpus, and the search ecosystem breaks
- Searchers won’t like LLMs or chat as a UI, and they end up as just another productivity tool
🤷
Whatever happens, it’s going to be important to understand the technology.
Books 📚
Make Your Own Neural Network: Learn how to build, from scratch, the kind of neural networks that underlie ChatGPT, Bard, etc. Requires basic Python knowledge, but is very clearly written.
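To give a flavour of what the book teaches, here’s a minimal sketch (my own, not the book’s code) of the forward pass of a tiny one-hidden-layer network in Python:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Random weights for a 3-input, 4-hidden, 2-output network
w_hidden = rng.normal(0.0, 0.5, (4, 3))
w_output = rng.normal(0.0, 0.5, (2, 4))

def forward(inputs):
    hidden = sigmoid(w_hidden @ inputs)  # hidden layer activations
    return sigmoid(w_output @ hidden)    # output layer activations

print(forward(np.array([0.9, 0.1, 0.8])))
```

The book builds this up step by step, including training via backpropagation.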
Key papers 📝
Attention Is All You Need (2017): Introduced the transformer architecture that all recent LLM progress has been built on.
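The paper’s core building block is scaled dot-product attention, which is compact enough to sketch in a few lines of numpy (the shapes and values here are toy choices, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V, as defined in the paper
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    return softmax(scores) @ V

# Toy example: 3 query tokens attending over 4 key/value tokens, dim 8
rng = np.random.default_rng(42)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(attention(Q, K, V).shape)  # (3, 8)
```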
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2019): Seminal paper from Google introducing bidirectional transformers. Useful for SEOs as it has been used by Google to improve query understanding.
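You can get a feel for BERT’s bidirectional context with Hugging Face’s transformers library, which provides a fill-mask pipeline (this downloads a model on first run, and is of course not Google’s production setup):

```python
# Requires: pip install transformers torch
from transformers import pipeline

# BERT predicts the masked word using context on *both* sides of it
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for result in fill_mask("The search engine crawled the [MASK] to build its index."):
    print(f"{result['token_str']!r}  score={result['score']:.3f}")
```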
Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks (2020): Introduced using a neural network to retrieve data from external sources, e.g. a search engine index. Very likely how Bing provides sources for its GPT-4 integration.
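The retrieve-then-generate pattern itself is easy to sketch. Here’s a toy version with crude bag-of-words retrieval over an in-memory ‘index’; the prompt assembly stands in for the LLM call, and real systems (Bing’s included, presumably) use a full search index or dense vector retrieval:

```python
from collections import Counter

# Toy "index": a real system retrieves from a search engine or vector store
DOCS = [
    "GPT-4 is a large multimodal model released by OpenAI in 2023.",
    "BERT is a bidirectional transformer introduced by Google.",
    "Bing cites sources retrieved from its index alongside GPT-4 answers.",
]

def retrieve(query, k=2):
    # Crude word-overlap scoring; real RAG uses learned dense embeddings
    q = Counter(query.lower().split())
    ranked = sorted(DOCS, key=lambda d: -sum((q & Counter(d.lower().split())).values()))
    return ranked[:k]

def build_prompt(query):
    # Retrieved passages are prepended so the model can ground its answer
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("How does Bing cite sources with GPT-4?"))
```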
TALM: Tool Augmented Language Models (2022): Adding ‘plugins’ (capabilities beyond language) to LLMs.
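The basic loop: the model emits a structured tool call, the runtime executes it, and the result goes back into the model’s context. A hypothetical sketch (the CALL/RESULT format and the calculator tool are invented for illustration, not TALM’s actual interface):

```python
import ast
import operator

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def calculator(expression):
    # Safely evaluate simple arithmetic, a capability LLMs are weak at
    def ev(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")
    return ev(ast.parse(expression, mode="eval").body)

TOOLS = {"calculator": calculator}

def run_tool_call(model_output):
    # The model emits e.g. "CALL calculator: 12 * 31"; the runtime runs
    # the tool and feeds the result back into the model's context
    name, _, args = model_output.removeprefix("CALL ").partition(": ")
    return f"RESULT {TOOLS[name](args)}"

print(run_tool_call("CALL calculator: 12 * 31"))  # RESULT 372
```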
GPT-4 Technical Report (2023): Describes the development of GPT-4, as used by Bing.
Podcasts 🎙️
Courses 🎓
Machine Learning Specialization by Andrew Ng: Coursera course covering the fundamentals, from linear regression through neural networks.
Newsletters 🗞️
Import AI: The latest AI news from industry and academia. Comes with a free short story inspired by AI, which is very cool.
TLDR AI: A daily email summary of AI launches, news, papers, etc.
Social 😄
Front page of Hacker News: I’m going to waste time on social media anyway, so I may as well learn something.
Header image generated by DALL·E 2.