No-GPT learning, freelancers meet on 14th August 2024


On the 14th of August, 15 freelancers met in the beer garden of the Battle of Trafalgar pub in Brighton to chat about all things self-employment and tech.

This is some of what we talked about:

  • The Eisenhower Matrix for prioritising tasks
  • Learning React
  • Trying to break into tech
  • Farm history
  • Coworking in Thailand
  • Chiang Mai
  • SEO and Analytics
  • Uses for AI tools
  • No-GPT development – learning without AI to help really learn a skill
  • How to do Search Engine Optimisation in 2024
  • Why use Google Ads? Why do SEO when Google Ads exists?
  • Bialowieza forest in Poland
  • Brighton SEO conferences
  • Drinking in Brighton
  • Mystery shoulder injury
  • Excluding PDFs from Google search
  • Google Suite admin frustrations

Highlights

No-GPT learning

Sul is learning React as part of trying to break into the tech industry (if you’re looking for a junior for your tech company, get in touch with him. Sul is smart, hard-working, and just needs a chance to show how useful he could be for your company.)

We were talking about AI tools with Christopher, who offers SEO as one of his services, and looped back to how Sul is learning. “No-GPT” development came up as a concept: developers who are happy to use no AI tools as part of their development process. Sul is taking the “No-GPT” approach to his learning too. He found that being able to get quick answers from ChatGPT and similar tools meant the solution never really sank in, even though he was reading the code to try to understand it.

By researching solutions through search, and some trial and error with the commands he finds, he’s retaining the knowledge much better. It seems slower, but it works out quicker, because he’s genuinely learning rather than looking up the same thing time and again.

There is growing skepticism about the utility of AI (that is, Large Language Model based AI) for many of the areas it is currently being sold into, and I see this adding to the arguments around that. Does learning with a tool that hands you chunks of answers actually benefit you? If someone teaching a skill were giving you lumps of information you could use without really understanding them, would we find that acceptable, or suggest they should teach in a way that helps their students properly learn the basics? It’s going to be interesting to see how LLMs actually end up being used in 2–5 years’ time, when the hype has died down.