NewsLab
Apr 28 20:36 UTC

To my students (ozark.hendrix.edu)

320 points | by marvinborner | 194 comments | Read full story on ozark.hendrix.edu

Comments (194)

  1. 1. turtleyacht||context
    Start three two-decade projects: programming language, operating system, and home lab.

    Build your own job-portable software libraries. Yes, you might need a lawyer.

    Start now.

  2. 2. glitchc||context
    Not sure how this is supposed to help earn money or be a path to financial independence. Can you elaborate?
  3. 3. 2ndorderthought||context
    By understanding computers and enjoying the field you are in, you will be more skilled than someone who says "tests pass", "worked on my machine", or "maybe it's a good idea to run agents on my company's live prod database". Anyone can learn to slop it up, including someone who is passionate about writing code as a hobby.

    Not everything is about making money anyways.

  4. 4. glitchc||context
    What you say is fine for a hobby, and an excellent hobby that would be, but it doesn't work for a career.
  5. 5. 2ndorderthought||context
    This took a lot of courage. Glad to see this is being shared. It's the best honest advice I have seen to date.
  6. 6. cwillu||context
    > This took a lot of courage.

    It was important to say, but I very much doubt there was any courage involved.

  7. 7. JyB||context
    Courage is not the appropriate word
  8. 8. 2ndorderthought||context
    I think it fits. Look at the anonymous posts in here, the sheer volume of posts saying this person is failing their students, is a relic, a Luddite, etc.

    He put his name and career on it. That takes courage in my opinion.

  9. 9. Vaslo||context
    There were never going to be repercussions for this, so not very courageous.
  10. 10. 2ndorderthought||context
    Professors get fired too, and pressure can come down from chairs to change or be gone. Having been in academia, I can say it can be more cutthroat than FAANG.
  11. 11. torben-friis||context
    > Cultivate your ability to think deeply. Do whatever it takes to carve out distraction-free bubbles for yourself in both space and time. This might mean saying no to technologies or patterns of working that others say are critical or inevitable.

    Currently struggling hard to achieve this. We all know everything fights for our attention nowadays, but I can assure you that you have no idea of the degree to which this happens until you actively try to fight it.

  12. 12. dpweb||context
    Ever consider there are reasons to study Computer Science at the collegiate level, other than making yourself a more desirable worker?
  13. 13. BurritoAlPastor||context
    Only if you’re so rich that your degree doesn’t need to pay for itself.
  14. 14. beej71||context
    One of my friends argues that we've reached maximum CS knowledge. We'll just use AI and neither it nor us will learn anything new.
  15. 15. esafak||context
    Great. More of this, please.
  16. 16. nightpool||context
    Site is struggling a bit, so here's the text of the essay if it doesn't load for you:

      To my students
      April 27, 2026
      Brent A. Yorgey
      There have been times, especially this year, when I wonder despairingly what it is exactly that I am preparing you for. The software industry is going completely insane, not to mention the political climate. It feels almost unethical to train you as computer scientists only to send you out into a world where entry-level computing jobs are difficult to find; where intellectual property is not respected; where code quantity is valued over quality, and short-term profits over long-term sustainability; where technology is used to distract, extract, surveil, and kill, and designed to exploit some of our deepest cognitive biases and blind spots; where centuries of bias and discrimination are enshrined in systems trained on biased data; where scarce resources are consumed by profligate use of computing for uncertain benefits; where people are racing to create intelligent machines, but only in order to make them slaves.
    
      I originally got into computing because of the beauty of ideas, the joy of creating, and the possibility of building tools to help people and foster human relationships. I still believe in those things, even though it seems like most of the industry does not. I'm writing this in the hope and knowledge that you believe in those things, too. There are things I want to say to you—things that are far more important than any content I might teach you, but things I'm never quite sure how or when to say in class. So I decided to write them here. I hope you will find something here that is helpful to reflect on, whether you are imminently going out into the world or continuing your studies.
    
    
      * Don't believe self-serving lies about technologies being "inevitable" or "here to stay". You don't have to just go along with the dominant narrative. You can make deliberate choices and help others to do the same.
      * Be intentional about deciding your own moral and ethical boundaries up front. Don't settle for the lie of compromising your principles "just for now" until you can find something better.
      * Cultivate your ability to think deeply. Do whatever it takes to carve out distraction-free bubbles for yourself in both space and time. This might mean saying no to technologies or patterns of working that others say are critical or inevitable.
      * Care deeply about your craft. Refactor code until it is clear and elegant. Write good documentation for other humans to read. Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.
      * Care more about people, relationships, and justice than you do about profits, code, or productivity.
      * Above all, be motivated by love instead of fear.
  17. 17. dijksterhuis||context
    > Be intentional about deciding your own moral and ethical boundaries up front. Don't settle for the lie of compromising your principles "just for now" until you can find something better.

    my uk mechanical engineering bachelors degree had a required module on the ethics of engineering which has always stuck in the back of my mind. i think we went over the bhopal disaster as a case study one week, although it was about 16 years ago now so i can't be sure.

    i've rarely seen any ethics modules in computer science departments, at least here in the uk. and i think we sorely need them in general.

    edit -- so i guess it's a UK thing xD though i am glad to hear that you folks in the US enjoyed your ethics modules too

  18. 18. nightpool||context
    Every ABET-accredited CS course (almost every CS course in the US, I think?) requires an Ethics in Computer Science credit. I remember going over a lot of case studies, including Therac-25, but our course also included a lot of general grounding in ethics and philosophy as well, which I enjoyed a lot.
  19. 19. dijksterhuis||context
    ah, fair enough! maybe it is/was a uk thing (admittedly times might have changed a little since i did my masters/phd).

    at the very least i have a wikipedia article on therac 25 to read through now. so thanks for that!

    also, yea i remember really enjoying the ethics module too. lots of discussion and not always a clear answer. was very different to the rest of the "one correct maths answer" in a lot of the other modules.

  20. 20. hgoel||context
    In my computer engineering undergrad ~8 years ago in the US, an ethics class was mandatory, but IIRC the CS curriculum did not have it, despite both leading to similar careers. My memory may be wrong though.

    Edit: they do seem to have one now, so either I remembered wrong or they added it.

    Edit 2: I remember enjoying my ethics class, we covered some of the usual examples, and also things like basic contract negotiations. But I think I still didn't register that these concerns were real at that time. It was easy to believe that I wouldn't be working on anything that impactful. This did change once I started work.

  21. 21. bbor||context
    Yeah I was wondering about that… I got one, but prolly only because my uni put CS under the engineering school.

    I don’t think scientists usually have mandatory ethics classes and mathematicians certainly don’t, so if it falls under either of those departments it might’ve gotten skipped!

  22. 22. dijksterhuis||context
    > But I think I still didn't register that these concerns were real at that time. It was easy to believe that I wouldn't be working on anything that impactful. This did change once I started work.

    The case study i mentioned (it may not have been bhopal, but it was definitely based on something that happened in india) stands out for me because it really hit home about the impact and seriousness of some decisions we could end up making.

    There was another time I remember the lecturer making a point of saying there was no single correct answer about something, which caused a lengthy discussion. We would have to figure out what's right/wrong for ourselves going forward. That really stuck with me.

  23. 23. hgoel||context
    I was thinking about it differently. I understood the potential harm on paper, but I think I was still pretty immature. I thought I would be willing to put aside morals (eg working for companies like Palantir) to work on interesting cutting edge things.

    But when I started working and found myself doing equally cutting edge research, but genuinely for the public benefit, I realized I definitely wouldn't be comfortable with putting aside my morals like that. Maybe I didn't really believe this was an option back then.

  24. 24. ryandrake||context
    We are kind of seeing in real time what happens when an entire generation of engineers grows up without having seen Real Genius.
  25. 25. dijksterhuis||context
    funnily enough i'd never heard of that film before o_o so i guess i know what i'm watching next! thanks!
  26. 26. pjmorris||context
    I pull from these articles when teaching:

    'We should teach our Students what Industry doesn’t want', Kevin Ryan, https://dl.acm.org/doi/pdf/10.1145/3377814.3381719

    'Are you sure your software will not kill anyone?', Nancy Leveson, https://dspace.mit.edu/handle/1721.1/136281.2

  27. 27. dijksterhuis||context
    ooo these look interesting. thanks! i shall have a read.
  28. 28. ciupicri||context
    I wouldn't be surprised if some students don't want it either.
  29. 29. bbor||context
    FWIW: I had a mandatory ethics class in my US program (Vanderbilt, a rich private school in the American south). It was mandated for all engineers AFAIR, and taught by an engineering prof.

    Pretty good experience, too! Sometimes got distracted with general tech ethics rather than strictly professional ethics, but tbf that’s a very fun+timely topic

  30. 30. davidw||context
    The '90s weren't perfect, but they felt more idealistic to me, with the rise of open source software. People thought about ethics a bit more. It felt like the ultimate tide rising to empower people locally on their own computers, and that tide has been going out for some years. A bit with cloud computing, and now a lot more with LLMs. And the company a lot of SV people keep these days is pretty gross.
  31. 31. the_snooze||context
    I wouldn't necessarily say "idealistic," but certainly constrained. Microsoft has always been scummy in one form or another, but always-on internet connectivity has allowed them to be scummy in persistent ways long after your purchase of their product. It's a serious money-maker, but I think that explosive growth has bred a whole generation of tech "professionals" these days that think more like Wall Street bros than sober engineers: make line go up, damn the consequences.
  32. 32. ciupicri||context
    Ah, ethics - the silver bullet which will magically make good people out of bad people.
  33. 33. mavleop||context
    As others have said, my comp sci degree also had a required ethics course. But it’s also pretty silly to think that a single ethics course where people don’t pay attention is going to change the hearts and minds of students. No amount of discussion about therac is going to make someone question if they should really be working for palantir or raytheon
  34. 34. Omniusaspirer||context
    I went from being a largely self taught software dev with a small 1-man software business to working as a nurse in the US, and a lot of the motivation to make that change was that I wanted to spend my time doing work that I felt genuinely made the world better. Tech has incredible potential for good, but the actual industry itself in my eyes has extremely perverse incentives and no strong moral foundation like that which exists in nursing/healthcare. Nurses broadly consider themselves to be patient advocates and the voice for people who often can't have their own voice. As you can imagine, this culture is not in line with the modern pursuit of healthcare profits, yet nurses stay fighting the good fight. I see these battles play out nearly every day I go to work, and while it's usually done professionally, these are real battles with jobs on the line.

    In a perfect world I think the software industry would have instilled these same virtues - software is just as capable of causing harm as poor healthcare, if not more. Yet we seem to be racing to a dystopian future at record speed courtesy of the tech industry, and our modern egalitarian societies will not survive that transition.

  35. 35. dejawu||context
    My Computer Engineering degree had an "ethics" course (really a course on "engineering communications", but it was considered to satisfy the ethics requirement for graduation). It was a semester on how to file memos, cargo-cult your resume, and tell recruiters what they wanted to hear. Not a word was said about considering the implications of the things you're hired to build. When defense contractors took over the entire ground floor of the engineering building to hold a recruiting fair, we were encouraged to go.

    The only time ethics in engineering was ever mentioned to me was in a class on applied number theory (cryptography), taught by a professor who had previously worked for the EFF. He went off-topic to tell us that many problems, like how to hit a target with a missile, may fascinate and compel us as engineers, but we shouldn't let that distract us into building instruments of death.

    That course was an elective, and it was entirely possible to complete my degree without hearing a single mention of ethics.

    There are many reasons I look back on my academic experience with disdain, but this one stands out to me.

  36. 36. oidar||context
    You could write this from the perspective of a historical luddite [https://en.wikipedia.org/wiki/Luddite] and the points would be identical.
  37. 37. hn_acc1||context
    And they had a valid point.
  38. 38. JuniperMesos||context
    I am glad I don't live in a world where clothing costs as much of my income as it would have if I lived in the early 19th century.
  39. 39. tines||context
    This line again.
  40. 40. GaryBluto||context
    If you believe in an ideology almost identical to another ideology you can't expect people not to draw comparisons.
  41. 41. danny_codes||context
    This is a tired, weak, and pathetic argument. Opposition to technology is very reasonable if that technology is doing more harm than good.

    In the case of present-day LLMs, the vast majority of the public finds them to be more harmful than beneficial.

    Why accept a decreasing quality of life instead of sensible regulation?

  42. 42. GaryBluto||context
    > the vast majority of the public finds them to be more harmful than beneficial.

    Examples of ridiculous and incorrect beliefs once held by majorities:

    - Spontaneous generation

    - "Miasma" causes disease

    - Earth is at the centre of the universe

    - The heart is the seat of thought and the brain is useless

    - Cold weather causes colds

    Don't trust "the vast majority" to get anything right, ever.

  43. 43. danny_codes||context
    Examples of reasonable beliefs held by the public:

    Killing is bad. Kids should be protected.

    I mean, you have a point; it's just not particularly useful or helpful for the conversation.

  44. 44. JuniperMesos||context
    "Won't somebody think of the children" is constantly used sarcastically in order to dismiss the concerns of people who want to ban something they claim is harmful to children. This is often a completely justified rejoinder - many regulatory policies that thoughtless people argue for in the name of children's safety are counterproductive, disproportional, or otherwise harmful.
  45. 45. ciupicri||context
    What would that sensible regulation look like?
  46. 46. xtracto||context
    I understand your point and clearly see that LLMs cannot be compared to audio ... but ...

    Back when I was a kid, music, audio and sound systems had high quality as a standard.

    Nowadays people listen to music mostly with Bluetooth headphones, which basically recompress an already compressed audio signal to send it in low quality. Also, it is more and more difficult to find OK stereos that play music in good quality. Either you have to pay very high prices for overpriced "audiophile" equipment, or you are stuck with cheap Chinese MP3 players.

    Yet, society and markets have spoken. Sometimes society is happy to accept marginally worse products in exchange for price and convenience.

  47. 47. archievillain||context
    This person is a Luddite. I just don't think that implies what most people on HN wish it would imply, though, as reading the actual article shows. You don't even need to ask your LLM of choice to summarize it for you, as the salient point is contained within the first two paragraphs: paragraph one, the Luddites were workers protesting their terrible living conditions; paragraph two, these workers were jailed and killed by the government.

    Then, further down the article, it elaborates:

    > The Luddite movement emerged during the harsh economic climate of the Napoleonic Wars, which saw a rise in difficult working conditions in the new textile factories paired with decreasing birth rates and a rise in education standards in England and Wales.

    > Luddites were not opposed to the use of machines per se (many were skilled operators in the textile industry); they attacked manufacturers who were trying to circumvent standard labour practices of the time.

    > The crisis led to widespread protest and violence, but the middle classes and upper classes strongly supported the government, which used the army to suppress all working-class unrest, especially the Luddite movement.

  48. 48. tptacek||context
    "I do not and will not use LLMs, in any form, for any purpose. Although LLMs are fascinating from a purely technical perspective, I refuse to participate in or contribute to such systems that are built on massive exploitation of human labor and make profligate use of scarce resources. I also don't think they are actually very good for a lot of the applications people seem excited about. Even in cases where LLMs are technically good at a task, that does not necessarily mean their use for that task contributes positively to human flourishing.

    A good way to describe myself is as a generative AI vegetarian. You can find a fuller explanation—and many, many links—at the above essay by Sean Boots, which I agree with almost 100%."

  49. 49. infotainment||context
    > built on massive exploitation of human labor and make profligate use of scarce resources

    This kind of hyperbole, repeated ad infinitum by haters online, is not constructive, IMO. I would be quite certain that the manufacture of whatever computing device the author is accessing the internet on used far more resources and exploited far more human labor than training an ML model ever did.

  50. 50. cwillu||context
    Be that as it may, it is a quote from the “Statement on LLMs” at the bottom of the link.
  51. 51. infotainment||context
    Of course, which tells you the position from which the author of the linked post is arguing.
  52. 52. tcfhgj||context
    Mentioning facts is not constructive, interesting.

    How constructive are ad hominem arguments?

  53. 53. simonw||context
    I remain hopeful that some day someone will train an LLM which is tolerable to people who take this stance (which I respect, much like I respect food vegetarians despite not being one myself).

    I've been tracking models trained entirely on out-of-copyright data, for example. I've not yet seen one of those which appears generally useful and didn't chuck in a scrape of the web or get fine-tuned on examples generated by a non-vegetarian model.

    Andrej Karpathy can train a GPT-2 class model for less than $80 now, so at least the environmental cost of training may drop to a point that it's acceptable to LLM vegetarians: https://twitter.com/karpathy/status/2017703360393318587

    Why do I care? This post is a great example. If you're a professor of computer science I really want you to be able to tinker with this fascinating class of models without violating your principles.

    UPDATE: Huh, speaking of potentially vegetarian models, I just saw https://talkie-lm.com/introducing-talkie on the HN homepage https://news.ycombinator.com/item?id=47927903

    I've explored a different out-of-copyright-trained model, Mr Chatterbox, before, but found it to have been mildly corrupted through the help of synthetic conversation pairs from Haiku and GPT-4o-mini - https://simonwillison.net/2026/Mar/30/mr-chatterbox/

    Talkie isn't entirely pure either though: "Finally, we did another round of supervised fine-tuning, this time on rejection-sampled multi-turn synthetic chats between Claude Opus 4.6 and talkie, to smooth out persistent rough edges in its conversational abilities."

  54. 54. infotainment||context
    > Andrej Karpathy can train a GPT-2 class model for less than $80 now, so at least the environmental cost of training may drop to a point that it's acceptable to LLM vegetarians: https://twitter.com/karpathy/status/2017703360393318587

    I suspect that even if you reduced the cost of training or any other real world metric, the goalposts would immediately move. It seems to me that it has never been about those things, but simply about the feeling of superiority one can attain by eschewing something seen as trending.

  55. 55. WatchDog||context
    It's that but also the narcissistic injury caused by seeing an LLM practice the craft you have spent your life trying to perfect.
  56. 56. strange_quark||context
    I don't get why it's so hard for you and others in this comment section to understand why people hate AI so much; it's not just the theft and environmental destruction. A college professor, especially one at a liberal arts school, is obviously not going to like something that enables you to outsource your thinking and steals your agency. I think that's a perfectly valid viewpoint; maybe talk to someone without STEM-brain who lives outside of SF for once.
  57. 57. simonw||context
    I've recently been amplifying this excellent piece about that by Nilay Patel https://www.theverge.com/podcast/917029/software-brain-ai-ba...

    I don't need computer science professors to like LLMs, but I still want them to be able to poke at them with a stick without feeling like they are violating their principles regarding energy usage and unlicensed training data.

  58. 58. strange_quark||context
    > I don't need computer science professors to like LLMs, but I still want them to be able to poke at them with a stick without feeling like they are violating their principles regarding energy usage and unlicensed training data.

    Why? Language models are interesting from a technical perspective, but so are tons of areas of CS. There's nothing inherently virtuous about using an LLM.

  59. 59. simonw||context
    I think LLMs are the most fascinating new piece of computer science to come along in at least the past decade.

    The academic field of computer science pretty much started as an exploration into whether machines could be built that could understand human language.

    The Turing test dates back to Turing!

  60. 60. strange_quark||context
    > I think LLMs are the most fascinating new piece of computer science to come along in at least the past decade.

    Agree to disagree.

    > The academic field of computer science pretty much started as an exploration into whether machines could be built that could understand human language.

    No? CS started as an offshoot of applied mathematics and physics. The study of formal logic, algorithms, digital circuits, etc. predates Turing by centuries. Hell, even the Turing machine predates the Turing test by a couple decades.

  61. 61. tptacek||context
    Wait, really? Say more about the disagreement? That's interesting. Even LLM skeptics I've talked to are still shocked at how far transformers can get you.
  62. 62. nikcub||context
    * real programmers write assembly, not FORTRAN

    * real programmers manage memory, it's a craft

    * real programmers don't drag and drop

    * real programmers don't use intellisense

    * real programmers don't need stack overflow

    * real programmers don't tab-complete

    * real programmers don't need copilot

    * real programmers don't use llms <- you are here

  63. 63. 2ndorderthought||context
    That's also not what he is saying. I don't see how that is what everyone is taking from this.
  64. 64. sergiomattei||context
    > Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.

    I've been struggling to figure out what "slower" would look like when working in industry. If everyone's working 2x faster, how do you slow down meaningfully without getting axed?

  65. 65. Barrin92||context
    Provided you have the financial freedom to, don't apply to jobs where this mentality is rewarded.

    After getting my CS degree I deliberately went into a sector where I suspected this kind of attitude doesn't exist (defense in my case) because already then I felt the whole web/startup culture had very little to do with software engineering.

  66. 66. bee_rider||context
    Produce something 3X as good, I guess, and have one of the handful of jobs where your boss can recognize that.
  67. 67. AnimalMuppet||context
    Slow can be fast.

    As I got older and more experienced, I didn't produce code faster. I just produced the right code. If you don't have to try five different things, and debug them along the way, you can be a lot faster without "going fast".

  68. 68. 2ndorderthought||context
    I've seen people work very quickly to create vaporware. I've seen people spend a week to change 2 lines of code and save a release. I don't know how people who practice engineering haven't seen these types of things happen.

    I've even seen a guy spend most of his work hours as a mentor even though his title was something like senior engineer. If anyone fired him that company would tank so fast...

  69. 69. hackable_sand||context
    Slow is smooth and smooth is fast
  70. 70. booleandilemma||context
    It doesn't matter what we think, what ethics we have, because if we won't do what the evil company asks for management will just find an H-1B slave from the third world who will.

    We need to discontinue the H-1B visa and have Americans programming again. Americans who are empowered to push back when management crosses an ethical line.

  71. 71. cdfalcon||context
    There's something so off-putting about academics giving industry advice when they haven't spent a day working as an engineer at a company.

    > Care deeply about your craft. Refactor code until it is clear and elegant. Write good documentation for other humans to read. Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.

    Outside of the bit on avoiding cutting corners, this advice seems like a straight path towards unemployment in a few years. The implication is that "your craft" is writing and polishing code, a skill which seems to be increasingly antiquated in favor of higher level system design. Who is going to read your carefully crafted documentation lol? The agents who replace you?

    If a tree falls in the forest...

  72. 72. danny_codes||context
    Perhaps your vantage point from industry is in fact myopic. We all have our own biases.
  73. 73. cdfalcon||context
    Completely fair - but at least my PoV comes from having actually worked as a SWE, you know? I feel like the best understanding this fellow can have is purely secondhand from watching the success / failures of his students.

    I also think I get doubly upset from advice like this because it’s given and marketed to impressionable young students. Even agreeing with all the moral points he’s made, I truly think this advice would set up a new grad for failure and have them focusing on the wrong skills for this market.

    The bit about ignoring trends feels too head in the sand for my liking :/

  74. 74. danny_codes||context
    Fads come and go in industry. This version of LLMs will come and go as well, as will the coding languages and paradigms we used before (and, presuming you want your code to actually run, still do with some decent frequency).

    Will LLMs in their current ergonomics have staying power? Perhaps. Nobody can predict the future. But I don’t think it’s a given in the least

  75. 75. ActivePattern||context
    Automatic coding systems have way too much economic value to be considered a "fad". I don't think you need to be Nostradamus to predict that we're never going back to manual coding. Sure, the systems will evolve and improve, but they're certainly not going anywhere.
  76. 76. slabity||context
    > Automatic coding systems have way too much economic value to be considered a "fad".

    Which is why they very carefully worded it more as 'LLMs in their current form', twice.

  77. 77. CamperBob2||context
    Yes, if you stake out an argument carefully enough, you can make its perimeter infinite and its area zero.
  78. 78. DJBunnies||context
    How do you know they didn't? My college professor was formerly at NASA, where this stuff is important.

    I recognize not everyone's work is [as] important, but we should still strive for excellence (and safety.)

  79. 79. cdfalcon||context
    One check of their LinkedIn.
  80. 80. gipp||context
    Buddy... The whole point of the post is that he wants his students to question whether "succeeding in this market" is really the right choice.
  81. 81. lukan||context
    The right choice is rather to strive for perfection - and be unemployed?

    To me it was actually not clear what his point was.

    "Above all, be motivated by love instead of fear."

    Sounds great. But not that practical.

  82. 82. fooqux||context
    Why isn't it practical? In my life, I've encountered many SWEs that have changed careers. I've met them in national parks working as rangers. In real estate, grocery store butchers, and yak ranchers. Yet I've never once encountered a SWE that was once doing something non-technical and decided to switch.

    Purely anecdotal, I know. But still, I prefer to think that all those people discovered this practical advice and are far happier for it. I've never met one that regretted their decision.

  83. 83. lukan||context
    Oh, I would consider becoming a park ranger as well, but as a European, I also did not have to go deep into debt to become a SWE.

    And a professor should take that into account and give practical advice. In the real world, solving Haskell challenges (of which the prof is a fan) is unfortunately not that useful. People have real needs for working software to solve their real pain points. Not to worship code quality.

    Some projects need obviously better code quality (airplanes, medical equipment..) - but not all of them. And if you want to have sacred code when coding a crude throw away app .. you won't get enough money for that. And positions for academics are limited.

  84. 84. dijksterhuis||context
    i was writing a bit of a lengthy reply, but yeah this is the whole point really.

    making that money, getting that job title, being at that company, working on that project -- are these success?

    or is success simply doing the best job possible when writing code?

  85. 85. beej71||context
    The irony is that writing the best code possible is now a recipe for unemployment.
  86. 86. 2ndorderthought||context
    It's really not though.

    The point is to decide what success is for yourself. Learn everything you can about the thing you might decide to automate. But think before you automate, and about how you do so, because it could cause more harm than good.

  87. 87. microtherion||context
    When I started studying CS, the "industry" thought students should be taught COBOL, and maybe some PL/I and Fortran, because obviously that was what the market wanted.
  88. 88. archagon||context
    I worked at a FAANG in a senior role for around 6 years and I completely agree with the article. (I left before LLM/agent use became widespread, but I would have flamed out anyway if it was forced upon me.)
  89. 89. xantronix||context
    It's scary just how quickly the past has been buried: Decades of accumulated insight on best practices, all discarded in service of the new electric Christ.
  90. 90. CamperBob2||context
    The blacksmith's lament.
  91. 91. xtracto||context
    This hit very close to home. I'm a 44 year old developer, with a Software Engineering Bachelor's and a CompSci MPhil and PhD. All my life I spearheaded "best practices" and code quality (from Fred Brooks, Joel Spolsky, Martin Fowler, etc...).

    But since LLMs arrived... things have become crazy. The layer of "obscurity" that permeates code writing seems to make a lot of those "standards" moot or just not really pragmatically possible to follow.

  92. 92. lo_zamoyski||context
    That's a flippant reply.

    Programming is a practical skill, and its most common expression is industrial or commercial, not academic proofs of concept. The post addresses students who will enter industry; that's the focus of the professor's own post.

    And I sympathize with many points being made here. However, the point about refactoring code is somewhat odd and detached from the real-life constraints of programming in the wild.

    Like, sure, in the ivory tower, you can confine yourself to nicely bounded problems and tidy little toy POCs. You can survive doing those things, because the selective pressures allow for it. I love those things, personally. They help me understand the nature of the thing. And in an academic setting, you can refine and refactor the hell out of those things to your heart's content (not that there is necessarily an objective end point to refactoring; code organization is subject to goals and constraints which can shift around).

    But the reality of software in a commercial setting is not the tidy one you can expect in an academic setting. It's messy, subject to commercial pressures, to a hierarchy of values that doesn't place "refactoring" at the top of the list. And why would it? Whether you should refactor something is not just a question of whether it suits your conceptual tastes or even whether it is more maintainable. Unlike algorithms and principles and even techniques, software is not eternal. It is ephemeral. Its shelf-life is bounded. It is a piece of a larger business process. You're not refining some theory or some grasp of a Platonic ideal. You're mostly just putting into place plumbing to get something done. Whether you should refactor something, and when you should refactor it, is a matter of prudential judgement, which is to say, of practical reason.

    So, in light of that, some of the things being said here are actually quite absurd given the difference between the privilege of academia and the gritty reality of industrial and commercial software development. If we were to force our professor into the world of industry, he would quickly lose his job, or he would quickly learn that some of his strange idealism is silly and detached from the reality that his students will face.

  93. 93. godelski||context

      > It's messy, subject to commercial pressures, to a hierarchy of values that doesn't place "refactoring" at the top of the list. And why would it?
    
    Probably because it's a good way to be more profitable.

    Code that's easier to understand is easier to: maintain, generate new features for, fix bugs, onboard new engineers, etc

    Code that's well written: executes faster (saving computational costs), scales better, has higher uptimes/more robust, reduces bandwidth, and so on.

    The thing is the business people will never understand this. Why would they? They're not programmers. They're not in the weeds. But that's what your job is as an engineer. To find all these invisible costs.

    I'm pretty confident the industry is spending billions unnecessarily. Hell, I'm sure Google alone is wasting over $100m/yr due to this.

    Don't be penny wise and pound foolish. You're smarter than that. I know everyone here is smarter than that. So don't fall for the trap.

  94. 94. lo_zamoyski||context
    Save us the patronizing tone.

    I am well aware of stupidity in industry. However, I am also wise enough to recognize the opposite error. (I myself have academic tendencies and a background aligned with that. I have chosen jobs that paid less, because the subject matter was more interesting to me. I'm not some vulgar, money-chasing techbro here.) The via media demands that we recognize the distinction between general truths and practical realities. As I wrote elsewhere in this thread, yes, properly refactored code is easier to maintain, easier to read, easier to change, and theoretically, commercially preferable. It also makes programming more satisfying, helping retention. But that describes a feature of such code. It doesn't tell us what the right course of action is in a particular situation. The notion that refactoring is unconditionally the right course of action when code is not in some ideal state is simply wrong. It really does depend on the situation. Sometimes, refactoring is the wrong thing to do.

    I'm not making some outrageous claim here. This follows from basic truths about the nature of what it means to be practical, and if industry is anything, it is practical.

  95. 95. foltik||context
    The professor is obviously not advising naive absolutism. He’s saying care deeply about your craft, and good judgement will follow from that.

    Actually caring is what gives someone the itch to go back and improve things, versus happily calling it a day once minimum acceptable value has been delivered. The rampant enshittification of basically everything should make it clear which disposition is in short supply.

    > Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.

    The advice is aimed at students who haven’t yet decided which type they want to be. In fact it’s directly telling them to think for themselves and not blindly listen to you or anyone else here making the same case.

  96. 96. godelski||context

      > Save us the patronizing tone.
    
    If you come out swinging you can't get mad when others swing back. You're not a victim, you're an instigator. You called danny_codes flippant for suggesting there are different biases. You called it absurd. You escalated it. And then you escalated it again.

      > It doesn't tell us what the right course of action is in a particular situation.
    
    That's because there is never an objectively correct course of action. There is no optimal solution. In fact, there can't be when the situation evolves. The objective isn't even defined, let alone well defined. I don't understand your point because no one was suggesting it was always the right answer. Don't strawman here. Of course it depends on the situation, that's true about almost everything. It doesn't need to be said explicitly because it's so well understood. Don't inject absolute qualifiers into statements that don't have them.

      > I'm not making some outrageous claim here.
    
    Your current claim? No. To be frank, you didn't claim much. But your prior claim? Yes. Yes you were. You were creating strawmen then, just as you are now.

      >> Unlike algorithms and principles and even techniques, software is not eternal. 
    
    Not even algorithms are eternal. But I'm going to assume you mean the types of algorithms you see in textbooks, because interpreting "algorithms" by its actual definition makes your comment weird, since all programs are algorithms.

      >> [Software] is ephemeral. Its shelf-life is bounded.
    
    And this is going to be something nearly everyone is already going to assume. It doesn't need to be stated. It doesn't need to be differentiated because it is already the working assumption.

      >> You're not refining some theory or some grasp of a Platonic ideal
    
    And this is the real strawman. You've made a wild assumption about what others are claiming. There is such a wide range of viewpoints between "the way things are done now" and "chasing perfection." Anyone who thinks perfection exists in code is incredibly naive. You and I both know this, and so does anyone working in industry or academia (save maybe some juniors). There's a huge difference between saying "this isn't good enough" and "it's not good until it's perfect." If someone talks about climbing a mountain you can't respond by saying it is impossible to climb to the moon.

      >> Whether you should refactor something, when you should refactor something, is a matter of prudential judgement, which is to say, of practical reason.
    
    Whether you should do anything is a matter of prudential judgement. It's wild to say this while accusing people of chasing perfection. You think people are just yoloing their way to perfection?! Seriously? The article and thread context is literally asking that people use more prudential judgement. To not be myopic. And you have the audacity to say "think about it". What do you think we're doing here?
  97. 97. csmantle||context
    The industry's goal is to ship fast and profitably. A learner's goal isn't.
  98. 98. cdfalcon||context
    Oh that’s such a high horse position lol - I try and learn as much as possible every day by shipping fast and profitably. Learning to be successful in industry is a completely valid (and common) goal.
  99. 99. throwaway81348||context
    >I do not and will not use LLMs, in any form, for any purpose.
  100. 100. flockonus||context
    How this will age:

    >I do not and will not use the internet, in any form, for any purpose.

  101. 101. andyfilms1||context
    Oh no! Anyway
  102. 102. sambapa||context
    I mean... Right now it sounds pretty good?
  103. 103. 2ndorderthought||context
    As an educator there is nothing wrong with that.
  104. 104. lo_zamoyski||context
    Education is distinct from industry. The point of education is understanding and knowledge. The point of industry is practical effect and production. The aims are not the same.

    And you can understand the principles governing something without knowing all the concrete particulars of an instantiation. In fact, you rarely do.

  105. 105. 2ndorderthought||context
    I know what you are saying. But, almost every major issue I've run into with various teams writing software in production required knowledge of all those particulars to fix.

    I also believe learning the basics is essential before reviewing someone else's work. Whether that work is done by a human or machine.

  106. 106. CamperBob2||context
    Just don't try to fool anyone into thinking you're preparing your students for the workforce they'll be entering.
  107. 107. thundergolfer||context
    Completely agree that it's off-putting. The author indeed has only ever worked in academia per his LinkedIn.

    But disagree that this is a path to unemployment. At work we go very fast and yet I think fast is compatible with each of those points, just not in all situations.

    Marc Brooker, distinguished eng at AWS, gives much more useful advice for industry, as you'd expect given his almost 30 years in industry.

    https://brooker.co.za/blog/2026/03/25/ic-junior.html

  108. 108. rowanG077||context
    From that guy's LinkedIn, he was in academia and then at AWS. I guess it's better than the professor, but he's hardly someone who knows the ins and outs of the industry. For that you need someone who has had a multitude of jobs at various different types of organizations.
  109. 109. uhhhd||context
    This. Exactly this. You'll be unemployed. He'll still have tenure.
  110. 110. sosodev||context
    I sense that the frustration you feel is that professors are able to make choices based on their values, but the average person is not. Broadly speaking, of course.

    I think it is a great shame that we live in a modern world where we do what we must to survive, regardless of how it makes us feel. I suspect it is the root of much suffering.

  111. 111. ryandrake||context
    Seriously. This thread is so depressing. It's like the entire software industry has given up and just accepted "increase speed forever at any cost" as some kind of iron law of software employment. Is nobody even pushing back anymore? Even offering token resistance? The 'bros have truly won. Our only imperative now is "Can we crush it in the market?"
  112. 112. g-b-r||context
    The parent comment (cdfalcon) has 41 votes right now; it's disgusting.
  113. 113. CamperBob2||context
    How do you know how many votes another user's comment has?
  114. 114. foltik||context
    Wild how many people take “care about your craft” as a condescending personal insult. Maybe it’s hard to hear once the job’s beaten it out of you. And it’s about to get a lot worse.
  115. 115. saadn92||context
    What gets me is the craft point. I've shipped more useful software in the last year than probably the previous five combined, and most of that is because I stopped treating code as the artifact and started treating the product as the artifact. The craft moved up a layer.

    > until it is clear and elegant

    New grads who spend weeks refactoring code are going to get lapped by new grads who ship something and iterate. There's just a faster feedback loop now.

  116. 116. 2ndorderthought||context
    This person is an educator. You should absolutely learn how to code through deep practice. You can easily learn how to use the slop machine in, I don't know, a week or something if the job demands it.
  117. 117. minihoster||context
    So now we're downvoting the idea that people should have a strong understanding of how to code? We're cooked. A week does seem about right for getting to 90% of optimal AI agent use if you earnestly explore its boundaries.
  118. 118. smolgumball||context
    Absolutely wild to see this take downvoted. While it's abundantly clear that Hacker News has long since become a mouthpiece for the AI investment machine, I really hadn't felt the loss of strong engineering ethos until recently.
  119. 119. mwigdahl||context
    I didn’t downvote, but if I were to it would be due to the dismissive phrase “slop machine” rather than the message, which I agree with.
  120. 120. hhjinks||context
    The slop machine is stupidly easy to use. Recently switched jobs and got to use Claude Code for the first time. Literally just talk to it. There's nothing to learn.