If I started explaining differential equations to you right now, you would be lost unless you already understood algebra, geometry, trigonometry and calculus. That’s because there’s a shared language we use in math to explain and communicate about more complex concepts.
Could someone get an LLM to explain it to them? Sure! But when answering questions, an LLM isn’t going to give you the right answer; it’s going to give you the statistically probable answer. Which means it is not going to do the math. It is going to query its knowledge base and infer the answer from all the information it has gathered, including all the wrong answers caught in its net. It cannot tell the difference between the two.
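To make the “statistically probable, not correct” point concrete, here is a minimal sketch. The numbers are entirely made up for illustration: imagine a model whose training data contained many completions of “2 + 2 =”, including wrong ones. It samples from those frequencies; it never performs the addition.

```python
import random

# Hypothetical next-token distribution for the prompt "2 + 2 =",
# assembled from (imagined) training data that includes wrong answers.
completions = {"4": 0.90, "5": 0.06, "22": 0.04}

def sample_completion(dist, rng):
    # Weighted sampling over next-token probabilities --
    # the model picks a likely continuation, it does not compute.
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(42)
answers = [sample_completion(completions, rng) for _ in range(1000)]

# Most samples are right, but a noticeable fraction are confidently wrong,
# because the wrong answers are part of the distribution too.
wrong = sum(1 for a in answers if a != "4")
print(wrong)
```

The point of the sketch is that correctness never enters the loop: the model cannot distinguish the right completion from a frequent wrong one, only a likely completion from an unlikely one.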
Of course, to fully explain how LLMs work, the listener needs to understand statistics, which again requires linear algebra and calculus. (We can skip some of the geometry and trig this time.)
There is really no such thing as leveling the playing field when it comes to knowledge. Just like wisdom, it takes time and mistakes to earn.
This is a nice thought, but as we discussed in this thread here: Study Finds Learning via ChatGPT Leads to Shallower Knowledge, having an AI regurgitate material at you does not build knowledge the way actually learning it does. I would be more worried about it increasing the occurrence of the Dunning-Kruger effect when it comes to specialized knowledge.
Are you suggesting it’s time to stop making academic material, and researching new things?
There are many reasons to use tools. Some people use Godot because they do not know how to program a game engine. Some people use it because they don’t want to program a game engine. Some people use it because they want to make a game to sell. Some people do it for fun. Some people do it to learn.
But I don’t know if I agree with the “so much better” argument. Personally, I use Godot because it’s good enough and easy to use. I enjoy it more than Unity and Unreal. But some would argue that Unreal or Unity would be the way to go if you wanted something “so much better”.
Yeah…but people who write papers are writing them for people who understand the math/science/etc.
Previously inaccessible to them? Perhaps. But like not everyone is suddenly going to become a neurosurgeon. Because not everyone wants to.
Are you suggesting all information should be free and available to all? It’s a nice idea.
It is. But that, in itself, has issues. It isn’t reading and understanding the text; it’s running it through a statistical algorithm. For creating cliff notes for a book report, it’s probably OK. For giving a basic overview of a complex subject, it could be helpful. But relying on an LLM not to miss salient points assumes that it understands the salient points and can explain them. It excels at explaining things we already know. The less knowledge available on a subject, the more likely it is to make things up that are not true. Case in point: asking for help programming Godot. (It’s only been a week since the last post I saw on here from someone asking for help with the crap code an LLM made for them that didn’t work as advertised.)
As someone who worked in biomedical research, I gotta say that a brief overview is helpful for business meetings, but not for actually contributing to further research. I worked on the tech side and I can tell you that I just found it boring after digging deep enough. That’s why we had Subject Matter Experts (SMEs).
TBH, I’m not really seeing how one draws the line between biomedical research and common sense. Common sense to me is like eat healthy and exercise.
The problem with the tech, as mentioned in the video that @mrcook posted, is that Moore’s Law no longer holds up. That means future processors are going to draw more power, and the technology will slow, at least until quantum processors become much more common (and stable). The problem with the algorithms is that they write themselves: no one is sitting there coding them, because the loss in accuracy is considered worthwhile compared to the speed and breadth we get from LLMs.
I appreciate that you are optimistic about LLMs. I think that’s a good attitude to have. I also believe it’s important to understand the realities and drawbacks of technology even as we push it.