i read “the age of spiritual machines” by ray kurzweil in middle school—when no one around me was talking about ai.
by freshman year of high school, i was deep into big data, process mining, and ai.
one of the pioneers in process mining even reached out to collaborate (can’t remember why, but it happened).
ask anyone i spoke to back then, and they’d probably say the same thing: i wouldn’t shut up about building the largest consumer ai & data epicenter in africa.
today, a founder friend of mine is on that journey. his company? worth over $100M.
but enough yapping.
my point is that i’ve been a silent yet quick, avid yet cautious adopter of ai.
these days, you can barely go a day without reading something labeled ai, hearing ai mentioned multiple times, or using the subsegment of a subsegment of a subsegment of ai that's been made widely available to the masses: the deep learning artificial neural network generative pre-trained transformer large language model. if you don't know what i'm talking about by now, it's chatgpt.
yes, it’s a subsegment of a subsegment of a subsegment of another subsegment of ai.
over the next few months, more terms are gonna get thrown around.
my prediction is that with ai, there'll be the knows and the know-nots.
think of it like coding in the early 2000s—some people built the internet, others just browsed it.
ai is creating a similar gap.
you either understand how it works or you just use the outputs.
you either know this thing or you don’t.
the rise of a dichotomy as we'll come to know it. the fall of democracy as we used to know it.
no doubt, i can't tell you all there is to know. but i wanted to at least share some very basic knowledge of ai.
such that the next time you're in a conversation and someone mentions a deep learning artificial neural network generative pre-trained transformer large language model, you have some inkling that the GPT in "chatgpt" stands for generative pre-trained transformer.
so here it goes.
ai is the broad umbrella.
just one level down is "machine learning," or ML.
ML is divided into 4 areas (so far):
supervised learning: you feed labeled data into a model so it learns to map inputs to known outputs. what you give is what you get.
unsupervised learning: you guessed it. you feed unlabeled data into the model and it finds patterns on its own.
reinforcement learning: you reward or punish the model based on its progress to reinforce its learning.
deep learning: the training of artificial neural networks on large amounts of data. you can generally categorize deep learning into 4 areas:
feedforward neural networks: the simplest kind; data flows one way, from input to output
convolutional neural networks: process images
recurrent neural networks: process speech and text
transformers: the architecture behind large language models (llms), small language models, and multimodal models
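the branches above can be sketched as a nested dictionary. this is a toy map in python, following exactly the categories this post uses (not any official standard), and the `path_to` helper is just illustrative:

```python
# a toy map of the ai taxonomy described above.
# the category names follow this post, not any official standard.
taxonomy = {
    "ai": {
        "machine learning": {
            "supervised learning": {},
            "unsupervised learning": {},
            "reinforcement learning": {},
            "deep learning": {
                "feedforward neural networks": {},
                "convolutional neural networks": {},  # images
                "recurrent neural networks": {},      # speech and text
                "transformers": {
                    "large language models": {},
                    "small language models": {},
                    "multimodal models": {},
                },
            },
        },
    },
}

def path_to(name, tree, trail=()):
    """walk the tree and return the chain of parents leading to `name`."""
    for key, subtree in tree.items():
        if key == name:
            return trail + (key,)
        found = path_to(name, subtree, trail + (key,))
        if found:
            return found
    return None

print(" → ".join(path_to("large language models", taxonomy)))
# prints: ai → machine learning → deep learning → transformers → large language models
```

so "which subsegment are we talking about?" is really just "which branch of this tree?"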
so if we go a step further with transformers, we have
large language models: as the name implies, models that are huge, trained on massive datasets at great expense.
the most popular example of an llm is drumrollllll chatgpt.
gpt stands for generative pre-trained transformer. this means it was pre-trained on a massive corpus before you ever started chatting with it.
when you think about it, chatgpt in simple terms means you're chatting with a generative pre-trained transformer.
if none of what i said makes sense, at least i hope this one does.
chatgpt is a large language model, an artificial neural network pre-trained on large amounts of data.
chatgpt is a type of large language model
large language models are a type of transformer
transformers are a type of deep learning
deep learning is a type of machine learning
machine learning is a type of ai.
AI → ML → DL → Transformers → LLM → ChatGPT
between the first word (ai) and the last word (chat-gpt), there are 4 layers.
if you only focus on the word "ai" in conversations, you're several layers removed from the children, grandchildren, and great-great-grandchildren.
if you only focus on the word "gpt" in conversations, you're several layers removed from the grandparents, great-grandparents, and great-great-grandparents.
maybe none of this makes sense; maybe some of it does, or maybe all of it does.
regardless, never stop learning. cause these models aren’t stopping.
i created a recap of everything below.
so when you're listening to talks or podcasts about ai, pay close attention to which area of ai it is: llms, transformers, or just deep learning?
ttyl,
dulra <3