TrackerFF 2 hours ago

I'm guilty here. For years I've been meaning to learn modern webdev, but every time I've sat down to read the docs, tutorials, books, and what have you - I just give up after a couple of hours. Getting seemingly easy stuff done is just a drag.

The other day I decided to try ChatGPT 4o with canvas. For a solid year, I've planned to create some easy membership registration and booking system for this small club I'm part of - just simple stuff to book rooms in a building.

Well, to my absolute amazement - I had a working product up and running after 4 hours of working with ChatGPT. One block at a time, one function at a time. After a day I had added a bunch of functionality.

So while I'm not completely clueless on back-end programming, my front-end skills are solidly beginner. But it felt like a breeze working with ChatGPT. I think I manually modified at most 10 lines during all this; everything else was just copy/paste and uploading source files to ChatGPT.

Any errors I'd get, I'd either copy/paste, or provide a screenshot.

I actually tried doing something similar when GPT3.5 came out almost two years ago, but it was just too cumbersome then. What I experienced the other day felt lightyears beyond that.

So, did I learn anything? No - not really. But did it solve a problem for me? Yes.

EDIT: But I will add, it did provide solid explanations to any questions I had. Dunno how well it would have worked if my 70 year old mom had tried the same thing, but a gamechanger for people like me.

  • swatcoder an hour ago

> So, did I learn anything? No - not really. But did it solve a problem for me? Yes.

    And this is exactly the concern.

    The tools are genuinely useful for some tasks. But unlike club organizers getting to DIY some hobby project for their club, students aren't yet being tasked to produce useful things in the best way possible. They're being tasked to do fairly rudimentary things so that they can learn some fundamentals by way of practice.

    And likewise, in trades like ours, juniors are tasked to do useful things, but they're given affordance to deliver those things in ways that help them learn some fundamentals by way of practice.

Students and juniors who skip the practice are basically trading away their future expertise and readiness just to accomplish trivial things that either don't matter or barely do. Some of them may become the first generation of expert prompt engineers, accomplishing things in totally new ways in what amounts to a novel trade, but many of them are just going to be shooting themselves in the foot.

  • aimazon 2 hours ago

    The point missed here is that you didn’t need to write any code at all, with or without ChatGPT. ChatGPT helped you with busy work: you reinvented something that already exists, instead of using a mature and established membership platform, you built your own. The reason this has parallels to education is because that’s what education traditionally is: busy work.

    You did learn something, by the way: you learned how to use modern tools. You didn’t do things most efficiently but it was more efficient than writing code without the help of ChatGPT.

    • stackghost an hour ago

      >The reason this has parallels to education is because that’s what education traditionally is: busy work.

      Busy work is work that is assigned merely for the purpose of occupying one's time.

      That's not the same thing as practice. We drill children in arithmetic not to keep them busy but because it turns out repeatedly solving multiplication problems is an effective way to teach children their times tables.

      • hbosch an hour ago

        >That's not the same thing as practice.

Exactly right. In terms of education, there generally seems to be a blurry line between what is considered learning and what is considered memorization. If you memorize your times tables, it doesn't mean you've learned multiplication, for example... oftentimes the ability to memorize and recall things is opposed to learning, which means leveraging previous knowledge to solve something new.

        In the case of AI, it usually presents facts and opinions simultaneously (something a calculator famously does not do, for example). Facts are memorized, opinions are learned. In all core studies it's always been more important to understand what you're solving for, and why, rather than "how" to solve it. The continued dissolution of the "how" barrier is a net benefit for all of civilization, and when experts of "why" are valued more than experts of "how" the world will be a much better place.

        • nunez 5 minutes ago

          This is one reason why many educators are phasing out homework. It's great practice but can easily lead students to regurgitating information instead of understanding and retaining knowledge. This is also why quizzes and tests are vital in a well-designed curriculum: they test understanding (or at least are supposed to).

      • ghostpepper an hour ago

        This may be why the practice was invented but I bet there are plenty of teachers who see it more as a way to keep them busy

    • pclmulqdq an hour ago

      When you are learning something, that busy work helps. What you think of as busy work when you are a professional is actually often sort of novel to learners and is a simple example of how to do stuff.

    • JTyQZSnP3cQGa8B an hour ago

      > you reinvented something that already exists

      Since AI can’t invent new stuff, who will do that? Juniors who haven’t learned anything because of those tools? Or seniors who will disappear one day because they are retiring or are being replaced by AIs?

      I already work with juniors who use ChatGPT and cannot explain what they wrote. They have a fucking engineers degree and don’t know anything. It’s catastrophic and may increase in the future. What will happen if it continues like this?

      • aimazon 44 minutes ago

        code is an input not an output. People don’t care about code, they care about products. You can build something new using code that already exists: every product we use today is built on a lot of what came before.

        My point wasn’t that writing code with AI is bad, my point was that writing code for the sake of writing code is bad. If something already exists, use it. If something doesn’t exist, build it, bring something new to the world — whether that’s with hand-typed code or ChatGPT assisted code, I don’t care.

        I think we should write less code.

        • jpc0 26 minutes ago

          > You can build something new using code that already exists: every product we use today is built on a lot of what came before.

I don't disagree with this from a business perspective, but from an engineer's perspective I find it severely limiting.

Even very very basic things should probably stay fresh for you. If you cannot implement a basic parser (recursive descent / Pratt, etc.) you will very likely reach for regex when there is likely a better solution that isn't a lot of code.

You should probably know how to write leftpad... Or how to strip ASCII whitespace using an ArrayBuffer and a for loop in JS. These things are extremely easy but a little tedious to do, and they are fundamental skills for building up more complex solutions later.
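For example, both fit in a few lines of plain JS. A rough sketch (function names are mine, and the whitespace version assumes ASCII input, since char codes above 255 would be truncated by the byte buffer):

```javascript
// leftpad: pad `str` on the left with `ch` until it reaches `len`
function leftPad(str, len, ch = " ") {
  str = String(str);
  while (str.length < len) str = ch + str;
  return str;
}

// Strip ASCII whitespace (space, tab, CR, LF) using an ArrayBuffer and a for loop
function stripAsciiWhitespace(str) {
  const bytes = new Uint8Array(new ArrayBuffer(str.length));
  let n = 0;
  for (let i = 0; i < str.length; i++) {
    const c = str.charCodeAt(i);
    if (c !== 0x20 && c !== 0x09 && c !== 0x0a && c !== 0x0d) {
      bytes[n++] = c; // keep only non-whitespace bytes
    }
  }
  return String.fromCharCode(...bytes.subarray(0, n));
}
```

Tedious, yes, but there's nothing in there you'd want to be unable to produce on demand.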

You should probably know how to build and reason about some more advanced data structures in your language. Basic trees, directed graphs, tries. If implementing these is second nature for you, you can come up with novel solutions to actually novel problems when they come up.
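To make that concrete, a bare-bones trie is only a dozen or so lines of JS (a sketch, nothing production-grade):

```javascript
// Minimal trie: nodes are plain objects keyed by character,
// with an `end` flag marking complete words.
class Trie {
  constructor() { this.root = {}; }
  insert(word) {
    let node = this.root;
    for (const ch of word) node = node[ch] ??= {};
    node.end = true; // mark a complete word
  }
  has(word) {
    let node = this.root;
    for (const ch of word) {
      node = node[ch];
      if (!node) return false;
    }
    return node.end === true;
  }
  hasPrefix(prefix) {
    let node = this.root;
    for (const ch of prefix) {
      node = node[ch];
      if (!node) return false;
    }
    return true;
  }
}
```

Once this is second nature, things like autocomplete or prefix routing stop looking like problems that need a library.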

You also get an innate understanding of where the performance characteristics of certain algorithms and data structures actually lie. Because big O doesn't always tell the full story...

          • nprateem 22 minutes ago

            And yet in 20 years of coding I've never needed to write any of these. Even implementing a graph is something I've only needed once or twice.

            It's far more important to know what you want to do rather than how to do it.

        • nprateem 24 minutes ago

Yes, but there's a balance. The HN purists who are wedded to their knowledge struggle with this, but then someone does need debugging skills to go in and fix things when some of the stupid things AI does make things break.

          Also I've found telling it specifically where it's messed up is way more effective than just shouting at it to fix it after it's failed a second time. And sometimes you just need to manually fix it.

          I wrote an entire library last weekend, then rewrote it on Monday when I realised I'd messed up. Two things I wouldn't have bothered to do without AI doing the coding.

          I know how the important stuff works and I could pick my way through the JS, but glad I didn't have to write it. I mean, I just wouldn't have.

    • whimsicalism an hour ago

      I love that AI negativity on HN is so strong that we reclassify whatever work AI can do into “busy work” as soon as it is possible.

      • aimazon an hour ago

        That’s not my point. I’m not disparaging AI. I described AI as modern tooling that is beneficial to learn. I’m sure there are many professional developers saving time using AI to generate code they would have otherwise written. My point is that in this specific case, AI didn’t enable anything useful. I would have said exactly the same if the OP had written the code without AI. If a problem is long solved, reinventing it is busy work. Busy work can be fun, I reinvent things all the time, but that doesn’t change the nature of it. If the project they had built had been something novel (that does not exist) then it would have not been busy work.

        I was shit talking education if anything :)

  • tempodox an hour ago

    You happened to use an LLM for something that is most prominent in its training data. Do something off its beaten path and correcting all the hallucinations will be more work than just plain old learning it.

    • ants_everywhere 27 minutes ago

      This is my experience too. Even with pretty common problems in popular languages like Python. The code generated by ChatGPT 4o is full of bugs. If you give it feedback it just thrashes instead of trying to locate and correct the underlying problems.

      Even if you ask it to think about and correct the underlying problems, it still generates buggy code, often with the same problems it was pretty decent at reasoning about.

    • whimsicalism an hour ago

      not true in my experience - of course you have to work within its capabilities but i find it to be a capable partner across large segments of tasks

  • illwrks 4 minutes ago

    Not to put a downer but have you also perhaps created a liability for your club? If you don’t fully grasp what is going on with the code have you potentially left the door open to exploits?

  • treflop an hour ago

    If it’s not your daily job, I don’t see the harm. Sometimes I use ChatGPT for something I absolutely don’t care about.

    But if it is, then I think you’re trading it for a career of trial and error.

Regularly I watch people at work spend a week trying to solve a problem, but because I learned the fundamentals at some point in my past life, I am able to break down the problem, identify the root cause, and solve it within an hour.

  • hammock 44 minutes ago

    The whole thing reminds me of how we used to check out BASIC books from the library and manually type in complete programs, games etc from the book and run them. I wouldn’t say I learned NOTHING, far from it, but it definitely wasn’t a path to become fluent in BASIC

  • simonw an hour ago

I’ve long believed that the best way to learn anything in tech is to try to build something with it.

    My hunch is that people who use the process you are describing will still get a massive leg-up in learning skills like web development.

Often it isn’t a choice between using AI assistance to get something working vs. spending 20 hours figuring it out from scratch: it’s a choice between getting somewhere with AI or not doing the project at all, because life is full of things to do that are more rewarding than those 20 hours of frustration.

    Anecdotally, I’ve heard from a bunch of people who always wanted to learn software development skills but were put off by the steep initial learning curve before you see any concrete progress… and who are now building useful things and getting curious about learning more.

  • drdeca 11 minutes ago

    “In the Phaedrus, writing is the pharmakon that the trickster god Theuth offers, the toxin and remedy in one. With writing, man will no longer forget; but he will also no longer think.”

    When I have to navigate to somewhere I haven’t been before, I generally do not read a map, but follow instructions from some navigation software. As a consequence, I often don’t really know where places are, just the route I take to get to a destination. With GPS navigation, I do not get lost, but neither do I have much awareness of how locations are spatially arranged.

    Such technologies seem to always be like this.

    A potion which removes a difficult task, but also dulls the ability to do such tasks oneself.

It is like that one SMBC comic https://www.smbc-comics.com/comic/identity “Humans offloaded memory to books, then thought to computers. Now, we're offloading our desires to the network. All that remains are basic bodily functions, which we'll offload in another generation or two. At that point, we'll just merge into one united entity, so it all works out.”

  • sibeliuss an hour ago

    As someone who already has skills in backend / frontend, AI tooling has made me fearless in terms of new material. I couldn't type it out by hand, but by getting something working through a (much faster) trial and error process, I'm learning so much! I suspect this is your case as well. There's a lot of learning going on underneath, which will only improve your abilities in ways that will come back as astonishingly beneficial if you keep working on your project.

  • furyofantares an hour ago

    You also didn't learn modern webdev for the years you'd been meaning to.

    You're actually better poised to learn it now if you care to, now that you have a component you care about that already works that you can work from. Of course maybe you won't, maybe having GPT there will indeed prevent you from ever learning it, I don't know.

  • bdlowery an hour ago

    You skipped all the hard parts, all the struggling, and now you have a working product without a mental model and can't level up to doing harder things on your own. Struggling IS learning. You didn't try different paths, piece different info together, and then eventually create a mental model. You just used ChatGPT to skip to the end result.

It's like enrolling in Calc 2, cheating on all the homework to get an A, and saying "did I learn anything? No, but it solved all of these annoying homework problems for me!" Now when you have to take the first exam you're screwed, because you didn't learn anything.

vunderba 8 minutes ago

The danger in the eventual ubiquitous availability of large language models (LLMs) isn't necessarily that they can seemingly answer any question.

The real issue arises when it becomes far too tempting to immediately turn to an LLM for an answer, rather than taking a few moments to quietly ponder the problem on your own, engaging with it, manipulating it, exploring different angles, etc. This kind of abstract thinking is a craft that only improves with consistent practice and deliberate effort.

aithrowawaycomm an hour ago

This condescension is very common and very irritating:

> Q. You say that the best experts of the future will be those who make the most use of AI. Are people who are waiting to use AI making a mistake?

> A. I get it, it’s an unnerving technology. People are freaking out. They’re getting a sense of three sleepless nights and running away screaming. It feels like an existential threat to a lot of careers. I think if you’re a good journalist, the first time you think, “oh no.” But then you start to see how this could help you do things better than before.

There are a lot of white-collar jobs where LLMs do more harm than good because a 1/4 hallucination rate means you waste too much time on wild goose chases. I briefly thought GPT-4 was useful for finding papers given a description of the results - I “kicked the tires” with some AI research and was very impressed. But when I tried to find papers on animal cognition, about 75% of the results were fictional, though supposedly authored by real animal cognition experts. And GPT-4o is even worse! The tools are just not good enough for my use case; Google Scholar is far more reliable.

I just don’t understand the childish motivated reasoning behind assuming the skeptics are scared. Maybe if I spent “three sleepless nights” talking to ChatGPT I would be more enlightened.

  • simonw 44 minutes ago

    > I briefly thought GPT-4 was useful for finding papers given a description of the results.

    That’s one of the many poorly documented traps of LLMs: trying to use them to find papers like that is a fast-track to worthless hallucinations. If that was one of your first experiments I can’t blame you for thinking this tech is “more harm than good”.

    LLMs are terrible search engines… except for the times when they are great search engines!

    Learning when and what to use them for continues to be a significantly under-appreciated challenge.

karaterobot an hour ago

> Q. Isn’t it inevitable that AI will make us lazier?

> A. Calculators also made us lazier. Why aren’t we doing math by hand anymore? You should be taking notes by hand now instead of recording me. We use technology to take shortcuts, but we have to be strategic in how we take those shortcuts.

An unpopular opinion I have is that most of the doomsaying about technology making us dumber is true. Yes, even back to Socrates. I won't say all, but I'd safely say a lot. What happened was that we developed tools, lost certain capacities without necessarily losing the capabilities that came with them, and redefined the level a normal human should function at. My only point is that people don't like to think that maybe they themselves are less intelligent—in many ways—than people who urinated outside and didn't know what the sky was. But I don't see how it could be any other way. When we say things like "I don't need to remember, I can write it down", and "I don't need to do arithmetic in my head, I'll let a calculator do it", or "I don't need to read the article, someone will explain it in the comments" we are accepting the consequences of that, good and bad.

  • aniviacat 31 minutes ago

    > redefined the level a normal human should function at

    To a level much higher.

    We stopped doing many repetitive, tedious things, but in return moved to things that are way more abstract and complex.

    And that's happening everywhere. Even farmers are getting ever closer to being full on system architects.

Oh, you didn't learn to quickly calculate square roots in your head? Instead you spent that time learning about relativity in high school physics class.

By calling the people of the past smarter, you are really underselling the amount and depth of abstract thought happening everywhere today.

  • master_crab 40 minutes ago

    People aren’t dumber, or smarter. They just focus on what’s the next important thing to tackle.

    For example, I doubt any website programmer knows the circuitry, assembly code, OS level calls, networking, etc, that make any webpage element do anything. Let alone can sit there and calculate any of the mathematical requirements needed to do any of that. But they know how to use an IDE and a framework like React.

    All this is a long way of saying:

    …on the shoulders of giants. AI is just the new tool needed for the next step up.

    • aliasxneo 14 minutes ago

      Yeah, I'm not sure I understand how moving up an abstraction layer necessarily makes you less intelligent. I also don't think it makes you smarter. For the average individual, I feel like it simply moves your "cursor" up the stack, but you're not necessarily increasing your context window.

      Perhaps the confusion comes from the fact that we often produce more complex things as we move up layers. It's then assumed that the people who made them must be more intelligent, but as I said, I don't think that's a fair assessment.

      I would say the real measurement for intelligence here is how much of the abstraction layers you actually understand. In other words, can you move your cursor back down the stack and operate just as well as in the higher layers? Can you do this while unifying the complex interactions between each layer into a cohesive model? I've noticed that even AI tends to be pretty bad at this last step. It often takes prodding to get it to see the subtle errors often introduced when working with complex systems.

lolinder 11 minutes ago

The headline implies something other than what the interviewee is saying:

> Q. You don’t like to call AI a crutch.

> A. The crutch is a dangerous approach because if we use a crutch, we stop thinking. Students who use AI as a crutch don’t learn anything. It prevents them from thinking. Instead, using AI as co-intelligence is important because it increases your capabilities and also keeps you in the loop.

> Q. Isn’t it inevitable that AI will make us lazier?

> A. Calculators also made us lazier. Why aren’t we doing math by hand anymore? You should be taking notes by hand now instead of recording me. We use technology to take shortcuts, but we have to be strategic in how we take those shortcuts.

ethn 12 minutes ago

‪ZIZEK: that AI will be the death of learning & so on; to this, I say NO! My student brings me their essay, which has been written by AI, & I plug it into my grading AI, & we are free! While the 'learning' happens, our superego satisfied, we are free now to learn whatever we want‬

renewiltord 2 minutes ago

Highly useful tools are always described like this. Amusingly, search engines and then, to a lesser extent, Stack Overflow were both described like this. I can’t say that it’s very interesting a statement in its nth incarnation.

_tk_ 2 hours ago

Very misleading title. From the article:

„The crutch is a dangerous approach because if we use a crutch, we stop thinking. Students who use AI as a crutch don’t learn anything. It prevents them from thinking. Instead, using AI as co-intelligence is important because it increases your capabilities and also keeps you in the loop.“

  • ksd482 2 hours ago

    I feel like this is exactly what the title is conveying. What’s misleading about it?

  • fgbnfghf 2 hours ago

    I use AI to ask questions when I am not totally sure what the question is, and it is very helpful for narrowing that down. It can be powerful as a tool to help get your foot in the door on new knowledge. Just like google search there is a correct way to use it and an incorrect way.

    Another thing to consider is the motivation of companies like OpenAI. Their products are designed to be used as a crutch. Their money is in total reliance on the product.

Asraelite 30 minutes ago

"Free browsing by accepting cookies" or "subscribe and decline".

Is this legal? It might be, but I've never seen other sites do it so it seems dubious.

bachmeier 28 minutes ago

This type of article is so frustrating. "You need to use AI to make yourself more productive." Followed by zero explanations of how I can do that. In addition, no mention of the implications of sending all your personal information to an entity that is waiting for an opportunity to use it against you.

EVa5I7bHFq9mnYK 13 minutes ago

Students who use assembly language as a crutch don't learn machine code properly.

pier25 an hour ago

The less we learn the more stupid we'll be. The brain, like any muscle, atrophies if you don't use it.

This is already happening with Gen Z in college.

https://www.theatlantic.com/magazine/archive/2024/11/the-eli...

  • pavel_lishin an hour ago

    I'd wager you can find a copy of an article with that premise about every generation going back to a generation after the invention of writing.

  • StefanBatory 23 minutes ago

    On the article itself -

    >> Lit Hum often requires students to read a book, sometimes a very long and dense one, in just a week or two.

I am not surprised by that. You have other classes as well, and reading a book for class is not like reading it for yourself - you have to research context, take notes, pick up themes, and so on. Also, for the university in question, aren't literature courses supposed to be taken by everyone? I don't imagine a maths or CS student approaching that with much enthusiasm (and vice versa).

Smithalicious an hour ago

Damn kids, back in my day we had to copy-paste our homework from stackoverflow uphill both ways

pluc 2 hours ago

Technology that renders effort and research pointless makes people lazy and stupid, story at 11

  • throwaway918299 an hour ago

    I’m sure there were people that said the same thing about Google. I’m pretty sure they even said the same thing about the written word.

    • yoyoyo1122 an hour ago

      It's such a "Back in my day..." mentality.

"Farmers today are much less skilled and knowledgeable than farmers 50 years ago!"

  • gomerspiles 2 hours ago

    I suppose a headline that doesn't have much to do with the content is also such a technology?

brookst 20 minutes ago

I’m old enough to remember when the same claims were made of graphing calculators and high-level languages like BASIC.

A better formulation is perhaps “students who use AI to reduce work learn different things”. It’s easy for purists to say there’s no value whatsoever in learning to use a tool rather than learning to do the work.

But that’s a judgment about the value of what is learned, and it’s kind of dishonest sleight of hand to substitute that opinion.

fnordpiglet an hour ago

“If you use a crutch you won’t learn anything” is an age-old truism. I don’t know why we would think a new tool somehow changes that dynamic.

My daughter is 10 and she is learning factoring, long division, and other things that a calculator does very well with. But she’s not allowed to use it at this stage because she can’t learn while using a crutch.

She’s also learning to write essays. She writes her essays, then puts them into ChatGPT and asks for analysis, feedback, and explanatory revisions. Then she revises the essay on her own without being able to refer back to the advice. This is using AI as a complement to learning and it’s been remarkably powerful. She can get feedback immediately, it’s high quality and impartial, and she can do it as many times as she finds useful. So, the fundamentals of learning don’t change no matter how powerful or different the tools become. But ignoring the tools just because they can be misused in place of learning is dumb.

  • ryandrake an hour ago

    > My daughter is 10 and she is learning factoring, long division, and other things that a calculator does very well with. But she’s not allowed to use it at this stage because she can’t learn while using a crutch.

    Which seems silly to me, but what do I know, I'm not a teacher. Nobody does long division in real life after K-12 school. It is not a useful skill to have, and it is not a useful concept to know. If I have to divide two numbers I just use a calculator like 99% of the humans on the planet.

    Knowing what division is, and what it means to divide one number by another is valuable, but can you just teach that without teaching the mechanics of "divide the partial dividend by the divisor, then multiply the partial quotient by the divisor, and subtract from the partial dividend, extending to the next blah blah blah blah"? Are we really training the next generation for a world without electricity?

    • ben_w an hour ago

      > Nobody does long division in real life after K-12 school. It is not a useful skill to have, and it is not a useful concept to know.

      To my surprise, when I did pure maths at A level I found the same ideas applied to dividing one polynomial by another.

      Of course as a mere memorisable algorithm a computer can also do this, so I'm not sure how useful it is even to pure mathematics, but there is (or was, 20 years ago) some use to the idea.

    • nasmorn an hour ago

      Maybe learning to follow a simple algorithm helps some children to structure their thinking. Divisions are not hard to do. Doing a lot of them has little benefit though.

nonrandomstring an hour ago

> don't learn anything

I don't think this is true. We learn a lot: Deference. Dependency. Entitlement. Impatience. Conformity. Distraction. Overconfidence. Intemperance...

If something "does the thinking for you" it has a much deeper effect than simply being a "crutch for the mind". It changes our relation to the world, to knowledge, motives, ambition, self-control...

"AI" is going to change our minds, but from what I've seen so far the outcome is a really quite awful kind of person, a net burden to society rather than a creative and productive asset.

iamleppert 2 hours ago

<Shrugs in passive aggressive>

That’s what they said when the calculator was invented. Out with the old, in with the new! Sorry but not sorry life was so hard before but we got AI to do the work for us now.

  • giantg2 2 hours ago

    Not the same at all. The results of calculators are verifiable. The results of AI can be dangerously wrong without easy validation. Calculators don't eliminate the application of concepts from learning, but AI does.

    • ben_w an hour ago

      The results of a calculator are only easy to verify with either someone who can do the same arithmetic or who has another calculator.

      I had this happen to me once while shopping, where I could immediately tell that three items costing less than £1 each should not come to a total of more than £3, but the cashier needed that explained to them.

      (And that's aside from anything about asking LLMs to output in a format suitable for mechanical validation, which they can generally do).

    • parpfish 2 hours ago

      Verifiability is part of it, but the other part is that calculators don’t provide a full end-to-end solution (unless you’re doing worksheets for homework).

Each calculation is just one step; it’s up to the user to figure out which steps to take and how to chain them together. They might even learn that the whole thing would be faster if they could do some of those calculations in their head.

like if you’re trying to figure out how much wood to buy for a deck, you’d still need to break the big problem down into those individual computations to do in the calculator. Unlike an LLM, where you could just ask it and it’d jump straight to a final answer

    • rthrth45y 2 hours ago

Good point, but also not a new problem. Humans use the same mechanics to assign variable weights and biases in order to validate information. If you look at a banana, you can determine what it is based on the knowledge you contain, and that process triggers a similar cascade of weights. The difference is brains are much more capable of this, but we still misremember things and recite wrong information. The game of telephone is a great example.

      I don't think AI eliminated the application of concepts from learning. I think that has been eliminated enough due to the erosion of our public education systems. If we are not capable of critical thought with the information our own peers present us, why would it be any different when we seek it from AI?

  • givemeethekeys 2 hours ago

    My friends in a top tier high school had no intuition about numbers because they used calculators for everything, including basic addition and subtraction.

    Some of them had customer service jobs where they became utterly confused if the change was 99 cents and the customer gave them an additional penny.

    • pclmulqdq an hour ago

      Everyone crying "calculator" about ChatGPT has me convinced that ChatGPT is bad for your education. Learning to do mental math as a child sucked, but now that I can do it, my brain is so much more free to think about stuff that matters. There's no intervening step of "let me pull out a calculator to see what that is," I just know the answer. The thoughts can just flow freely.

      • SoftTalker 24 minutes ago

        This is true of all memorized facts. It enables thinking at a higher level. “Why should I learn multiplication when I have a calculator on my phone” ignores this.

        The more you have memorized the more nimble your thinking is. If you have a large vocabulary you can effortlessly express yourself with precision while others are thumbing through a thesaurus (or these days asking an AI to “rewrite”).

        If you know the history of something you can have more interesting perspective and conversations about it.

        There is almost no situation where the person with a lot of memorized knowledge is at a disadvantage to the person who needs to look everything up or rely on tools to do the work.

        • skydhash 6 minutes ago

          True. I have friends who refused to learn algorithms and instead insisted that they only needed to master $FRAMEWORK. Then they got stumped by any problem that cannot be solved by $LIBRARY, spending days on it with no result.

          Yes, it takes time, but learning is exponential, and over time the pace will increase greatly.

      • ben_w 38 minutes ago

        That's true for every skill that comes fluently. We've only got limited time, which skills really matter?

        I'd say yes to basic arithmetic; but I can't really use my own experience as a software developer who started off in video games to justify why a normal person needs to understand trigonometry and triangle formulas, any more than I can justify why they need to study Shakespeare and Alfred Tennyson over e.g. Terry Pratchett and Leonard Cohen — "I find it intellectually stimulating" is perhaps necessary, but certainly not sufficient, given there's more to learn than we can fit in a lifetime.

    • mistrial9 2 hours ago

      except that the typical case is ... the charge is $1.01, I give you a $5 and a penny. A penny from the client to the house on top of a charge of $0.99 by the house does xxxxxxxxx ... (edit) as pointed out below, a penny plus a $0.99 charge means that the cashier can return a whole number of bills, avoiding any coins.

      • abanana an hour ago

        The change, not the charge. If the change is 99 cents, the customer gives an extra penny, the change is now 1 dollar, avoiding a handful of coins.

        It seems to have become the norm for young cashiers to be unable to understand this. And if you try to explain, they'll insist "I can't change it now I've rung it through". Some seem to think the system keeps an exact record of the quantity of each individual coin (or they just don't know where to begin thinking about it).
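
        The penny trick is trivial arithmetic once written down. A minimal sketch, with invented amounts (charge $4.01, customer pays $5.00):

        ```python
        def change_due(paid_cents, charge_cents):
            # Change is simply what was handed over minus the charge.
            return paid_cents - charge_cents

        # Customer hands over a $5 bill: change is 99 cents, a fistful of coins.
        print(change_due(500, 401))  # 99
        # Same charge, but the customer adds a penny: change is exactly one dollar bill.
        print(change_due(501, 401))  # 100
        ```

        The register's total is unchanged either way; only the denominations handed back differ.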

        • Dalewyn an hour ago

          Another possibility: "The register says 99 cents change and I am not paid enough to give any more of a damn than that."

          • pessimizer 27 minutes ago

            If you have basic comfort in arithmetic, this is not a calculation that involves giving a damn. Being confused about why someone would give you an extra penny and having a discussion about it with a stranger burns 100x more calories than knowing it. If basic arithmetic involves taking a deep breath and closing your eyes for half a minute, or looking around the room for a calculator, that's a different cost/benefit analysis.

            It's like the difference between a language you are fluent in and a language you are tentative in. If you're fluent, you have to make an effort not to listen to somebody's loud conversation, or not to pay attention to a billboard. They intrude into your consciousness. There's never a situation when I don't do simple arithmetic when exposed to it. I don't have to consciously figure out what 4 times 9 is. Subjectively, the number just pops into my head when I see the question.

            edit: If you can't do this with explanations of identities or related rates, etc., it's hard or impossible to follow any quantitative or especially probabilistic argument. Even the simplest ones. I think this results in people for whom arithmetic is difficult faking it by trying to memorize the words used during quantitative arguments without having any real understanding. Just sort of memorizing a lot of slogans and repeating them during any argument that shares similar words. I think discomfort with arithmetic ruins people politically (as citizens), so I really do think calculators are a problem.

  • s0ss an hour ago

    Lots of nuance that you’re not addressing, IMO. Here’s some more nuance:

    Steroids. It's not a perfect metaphor, but I think it's useful. Two people are trying to gain muscle mass. They both have an ideal starting point. The first person has a healthy diet, lots of exercise, and sleep. The second person has all of the same things as the first person; however, they are also taking growth hormones.

    Lots of folks look at the two results and will see lots of different things. Beauty is in the eye of the beholder I suppose. If you think the end results of the work should yield sculpted bodies with larger than normal muscles… you might opt to use hormones. However, if you think sculpted bodies with larger than normal muscles looks unrealistic or just not your style/goal… you would probably opt for a more natural approach.

    Both have their merits and could be described as "fit" despite their differences. Folks may value one over the other. People might fantasize about looking like Thor, but if everyone actually looked like Thor, things would be weird. My two cents: Thor is fiction, and while we need fiction, I'm not going to pretend that anyone should look like Thor in order to be in shape or to be described as fit. If we allow ourselves to be fooled into thinking that it can be normal to look like Thor, then we are doing something wrong. Fiction should not become reality.

  • StefanBatory 2 hours ago

    No, I can't agree with you here.

    I'm a software engineering student. I had a phase a year ago where I was using ChatGPT a lot, a lot more than I ever should have.

    And it messed with my brain a lot. I felt I became utterly lazy, to the point where quick fixes that should have taken me 10-15 seconds I'd instead do with AI, which often took far longer.

    And the point of studying is to learn. You won't learn anything if you have someone else write your software for you.

  • wredue an hour ago

    A new study also shows that people using AI produce code with 41% more bugs. And that's just what the users missed!

    Calculators aren't giving you "kind of correct" answers.

  • blibble 2 hours ago

    the difference is I still understand everything the calculator can do, and can do it by hand on paper

    the AI generation is not going to know how to do anything other than type into chatgpt

    at which point human progress ends and we start going backwards

    • SoftTalker 17 minutes ago

      And worse, the people who control the AIs now have ultimate power to rewrite history and mold opinion. If they want everyone to think the earth is flat they can do it.

  • lawn an hour ago

    > That’s what they said when the calculator was invented.

    And it's beneficial to ban calculators for learning, which is the point of the article?

  • nonrandomstring an hour ago

    > <passive aggressive> > life was so hard before but we got AI to do the work for us now.

    Aggression against what? Yourself?

    I think you show a tragic misunderstanding of technology and what it is doing in the world. It's not the work that it's doing for you. It's the living. Is it really your life you want a machine to take?

    Nobody wants to "work". Henry David Thoreau said, "There is no more fatal blunderer than he who consumes the greater part of his life getting his living." All good, no? But that's not what "AI", in the hands of exploiters (or even yourself, as a self-exploiter), is going to do to you. Technology is more "productive" but creates more, not less, labour.

    Better to heed Max Frisch who said, "Technology is the knack of so arranging the world that we don't have to experience it." Would you employ a machine to enjoy a music concert for you? To have sex for you or play games for you so you're not troubled by the effort?

    • Dalewyn an hour ago

      >To have sex for you or play games for you so you're not troubled by the effort?

      Two of the three games I play on a daily basis largely play themselves, so... yes, actually. I still have plenty of fun watching them.

      • nonrandomstring an hour ago

        That is very interesting. Are you talking like city simulation games? I get the entertainment of quite passively tweaking and watching things unfold. But at what point would you say "hey I'm not really a participating player any more, this is just watching TV"? Is it still a game at that point?