ChatGPT first hit the headlines late in 2022 – and AI and its generative capabilities have exploded into public consciousness ever since. Every hour, web- and mobile-enabled apps are being developed and released, and they are capturing the public’s interest like never before.
Many of MinterEllisonRuddWatts’ clients are racing to use, or develop, generative AI tools like ChatGPT to gain efficiencies and grow their business. But to fully realise the transformative power of this ever-changing technology, we need to grapple with the underlying process upon which it was built.
In part one of this two-part episode, Partner Tom Maasland takes generative AI back to basics with Matt Ensor, Co-founder and CEO of Frankly AI and Business Lead, Transport at BECA.
Matt explains the core terminology (and technology) underlying ChatGPT, including [03:12] generative AI and how it differs from AI; [04:37] machine learning and deep learning; [06:55] large language models (or LLMs); and [09:52] prompts.
This is a must-listen episode for anyone seeking to better understand ChatGPT, generative AI, or both. For an easy-to-use glossary of the terms discussed in this episode, click here.
Please get in touch to receive an episode transcript, and don’t forget to rate, review or follow the Tech Suite wherever you get your podcasts. You can also sign up to receive technology updates via your inbox here.
About our guest
Matt Ensor is passionate about the intersection of technology and society.
Matt is the Business Lead, Transport at BECA, where his focus is the use of AI to accelerate business. He is also Chair of the Large Language Models working group of AI Forum NZ, where he leads the collaborative development of a White Paper for Aotearoa New Zealand on LLMs.
In 2020 Matt launched Frankly AI, a user-led, conversational AI tool that connects organisations with staff, stakeholders and indigenous communities.