NewsLab
Apr 28 19:09 UTC

Google and Pentagon reportedly agree on deal for 'any lawful' use of AI (theverge.com)

202 points|by granzymes||200 comments|Read full story on theverge.com

Comments (200)

120 shown|More comments
  1. 1. morkalork||context
    Will lawful use be determined in secret courts a la NSA and FISA?
  2. 2. Sanzig||context
    Doubtful it will even get that far; the DoJ will simply draft an appropriate fig-leaf memo with a predetermined conclusion and the government will plow on ahead.

    https://en.wikipedia.org/wiki/Torture_Memos

  3. 3. stephbook||context
    They simply say they have that memo. Who knows whether they even drafted it for real? And if anyone starts looking, Gemini can quickly draft one itself. Nice!
  4. 4. vrganj||context
    Don't be silly.

    "When the president does it, that means that it is not illegal." - Richard Nixon

  5. 5. kentm||context
    Also the Supreme Court, half of Congress, and apparently something like 40% of the American populace.
  6. 6. ceejayoz||context
    Who defines "lawful" if Google and the Pentagon disagree?

    > The classified deal apparently doesn’t allow Google to veto how the government will use its AI models.

    Seems concerning?

  7. 7. belzebub||context
    There's big air quotes energy in their statement
  8. 8. tdb7893||context
    Especially concerning given how creative the executive branch can be when it comes to what laws mean. With little oversight, it seems guaranteed that it will be used for unlawful activities (despite whatever tortured argument some lawyer will have put into a memo somewhere).
  9. 9. xp84||context
    Yeah, they’re really bad! Seems like it might be time to try convincing people to vote for someone else! Democrats haven’t tried that play since 2012, preferring the “scorn and insult anyone outside your base” strategy that’s worked so well since.
  10. 10. f33d5173||context
    Lawful is presumably defined in the usual, common-sense way, i.e. we can do whatever the f we want until a court physically forces us not to.
  11. 11. dmd||context
    And since the court has no way to physically force anything (that's the executive branch's function; it's right there in the name), lawful has no meaning whatsoever if it's the executive branch that wants to break the law.
  12. 12. muvlon||context
    And the Pentagon has historically gotten away with damn near everything even in the judicial branch by appealing to national security.
  13. 13. shevy-java||context
    It kind of reminds me of a mix of Skynet in Terminator and Minority Report. But nowhere near as interesting. More annoying than anything else.

    I am kind of mad at James Cameron here. Skynet was evil but interesting. Real life controlled by Google is evil but not interesting; it is flat-out annoying.

  14. 14. ApolloFortyNine||context
    This has to be one of the strangest "debates" in history.

    Congress and the courts obviously.

    If you think there's a hole in the law, tell your congressman; don't, for some reason, try to put Google or any AI company above the government.

  15. 15. ceejayoz||context
    > Congress and the courts obviously.

    The first is fully neutered. The second is far too slow.

    "Nothing unlawful" needing to be in the contract is inherently concerning, as it's typically the default, assumed state of such a thing.

  16. 16. deepsun||context
    "follow the law" in contracts IMO is there to be able to claim a "breach of contract" by one party.
  17. 17. calgoo||context
    Please! That ship sailed a long time ago. Sure, tell your congressman, who is most likely bribed (lobbying is bribing, let's use the real words) by the same companies to accept the deal. The courts can try, but who is going to enforce it when the people above say that it's fine?
  18. 18. CobrastanJorji||context
    That's presumably the trick, and it's not a subtle one; it's why the article puts it in quotes in the headline. Google gets to claim that it stood up for principles because it boldly insisted that the government obey the law, and the government will claim that whatever it decides to do is lawful. It's the same as what OpenAI did, except not handled buffoonishly.
  19. 19. ethagnawl||context
    The classified aspect is probably the most concerning. How can I write my representative (and expect a form letter response six weeks later) if I don't know what I'm objecting to or even if I should be objecting?
  20. 20. cooper_ganglia||context
    Why would you write a letter if you don't know what you're objecting to or even if you should be objecting?
  21. 21. ceejayoz||context
    Can't I object to not knowing?
  22. 22. cooper_ganglia||context
    No, that's what classified means.
  23. 23. ceejayoz||context
    Surely I can complain about overclassification of things that should not be classified?
  24. 24. xp84||context
    Absolutely. We will file your complaint in the appropriate location.

    The location is classified.

    OK, all jokes aside: if you suspect that there’s wrongdoing in the classified sphere, and it really matters to you, well, you should get involved in politics. We don’t just let everyone everywhere know everything, because we think it would be risky if Putin or the Chinese Communist Party also knew all those things. So we limit it to people who have taken oaths and are accountable and need to know (the military), the civilians who need to know (security clearance holders), and those who hold a high office with the public’s trust (high-ranking politicians). You can be a Senator. You just need a lot of people to trust you enough to vote for you. Or, and this is a bit easier, support politicians you do trust to be elected to high office and vet classified things, and ask them to look into it and give you their word that things are being done properly.

  25. 25. ethagnawl||context
    That's kind of my point? I'm concerned by what has been made available but can't form a complete opinion and decide if I need to take action without knowing the full extent of the agreement.
  26. 26. cooper_ganglia||context
    Nor should you be burdened with that.

    This is why we elect competent (hopefully) leaders to worry about these things for us. Mob rule democracy about every national secret would mean they’re not secrets for very long!

  27. 27. impulser_||context
    No, it doesn't at all. Private corporations shouldn't be telling the government what it can and can't do. That's the job of the people. You want a private corporation overriding your vote?
  28. 28. ceejayoz||context
    > Private corporations shouldn't be telling the government what it can and can't do.

    So Google can't tell the government it needs a warrant to perform a search? Google can't sue over something the government did?

    It's Google's product they want to buy.

  29. 29. serial_dev||context
    Just follow the orders, man!
  30. 30. red-iron-pine||context
    don't worry about the people getting sent to camps. it's lawful so it's okay.

    now follow orders.

  31. 31. impulser_||context
    I'm talking about lawful as it's written in the terms.
  32. 32. ceejayoz||context
    But Google isn't, apparently, permitted to object "that's not lawful".

    And again, it's Google's product. Why can't they set conditions? If I pay Google to host my email, I'm still subject to their policies.

  33. 33. xp84||context
    Agree. It seems convenient on the surface right now, when people think the company (or its rank-and-file employees?) are on their political “team”, but they’d get less comfortable when oil companies or other “bad” companies dictate terms to the government. “We’ll provide fuel for the military if and only if you overthrow the leader of $COUNTRY”

    (Yes, I recognize that past military entanglements do read as favors for Big Oil, but that’s more because lobbyists directly purchased the corrupt and useless Congress)

  34. 34. ceejayoz||context
    > “We’ll provide fuel for the military if and only if you overthrow the leader of $COUNTRY”

    A mechanism to address this exists, though.

    https://en.wikipedia.org/wiki/Defense_Production_Act_of_1950

  35. 35. yibg||context
    Of course it can. Terms of service and contractual obligation (should) apply to governments as well. Google is perfectly capable of outlining what's acceptable use and what's not, and the government is free to accept or reject and not use the product. Google is choosing not to set the boundaries.
  36. 36. cooper_ganglia||context
    Google should never be determining what is lawful or not.
  37. 37. kingleopold||context
    "who watches watchmen"

    question as old as time itself

  38. 38. jonathanstrange||context
    One thing is sure: they don't have international law in mind...
  39. 39. dismalaf||context
    By definition "the law" is the set of laws that the government passes. So it's a roundabout way of saying the government can pretty much do what they want.

    Also, this is probably the only acceptable arrangement when it comes to industry-government contracts. The government will always have more information than civilians.

  40. 40. jcgrillo||context
    It's pretty funny how these guys are all becoming some kind of internet version of, like, Halliburton. It seems pretty desperate. B2C and B2B applications didn't pan out I guess?
  41. 41. zarzavat||context
    It's one of the two uses for AI identified as profitable today: writing code and blowing up schools. They are desperate to show the market that the technology is anything more than a money pit.
  42. 42. ctoth||context
    The thing is we're in a new Cold War, and most of our adversaries have gotten the memo and most of us ... haven't. Yes, becoming a new Halliburton is a rational move if you see the board right now. I don't like it even one tiny bit.
  43. 43. a456463||context
    I don't like it even a tiny bit. But other people are doing it, so I'mma go full steam ahead.

    This is exactly what got us here.

  44. 44. tombert||context
    When my sister and I would play Monopoly as kids, we had lost the manual, so whenever we didn’t like the outcome of whatever happened, we would make up rules about what was right. Technically, then, it was very easy to stay compliant while still being able to do well, because we could rewrite the rules.

    Also, since I was older I feel like I was able to get away with those redefinitions a lot more often…

  45. 45. cucumber3732842||context
    The big reason it's "obvious" when tech megacorps do it is because big tech is new to the game and doesn't have an existing regulatory capture system already up and running and legitimized like medical, civil engineering, energy, agriculture, chemical, etc, do.

    If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, that they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

  46. 46. SecretDreams||context
    > If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

    My friend, this paragraph needed some periods. I could not follow what you were trying to say - but it seemed interesting enough to consider retyping.

  47. 47. Vachyas||context
    Good comment, and I agree lol

    I read it twice (admittedly quickly) but couldn't grasp the point even though I felt like it was there.

  48. 48. fwip||context
    It's not really hard to read.

    If this were a traditionally evil company, the work to legalize the evil things would have started forty years ago.

  49. 49. GeekyBear||context
    > big tech is new to the game and doesn't have an existing regulatory capture system already up and running

    The career officials in the Obama FTC started proceedings for an antitrust lawsuit against Google over a decade ago.

    The political appointees (of both parties) shut it down.

    It seems to me that regulatory capture has been working for Google for some time now.

  50. 50. tombert||context
    I mean it's basically an extremely high-stakes version of the (possibly apocryphal) Upton Sinclair quote: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it."

    Most people (at least the people I've talked to, which admittedly is somewhat of a lefty bubble, but I think even more generally) agree that companies getting to or close to "monopoly" status is a pretty bad thing, and that they should be broken up. Political candidates get a lot of social credit for claiming that they're going to do exactly that. The moment they actually get into a position where they could do something about it, they suddenly remember who their campaign contributors are, and can then create reasons to avoid actually solving any of these problems.

    Very occasionally we have successes in this field, like the breakups of Standard Oil and AT&T, but of course both of these sort of became toothless since we basically allowed the pieces of both companies to re-acquire each other and form the same problems again.

    There are similar reasons why politicians will occasionally push for regulations that would not allow themselves to invest in companies that their policies affect, but it somehow manages to never get through.

    Politicians are very rarely punished for breaking political promises, but often rewarded for making the promises. They are also rewarded by their corporate overlords for breaking these promises.

  51. 51. Aerroon||context
    >they suddenly remember who their campaign contributors are, and can then create reasons to avoid actually solving any of these problems.

    There are very real concerns when you break up a company, though. Rockefeller's wealth shot up a lot when Standard Oil was broken up. That could easily turn a "politician out to get the big companies" into a "politician making billionaires richer."

  52. 52. tombert||context
    Tough to say for sure, but I think it's probably still better to have more billionaires if there's more competition.

    I wasn't around during the breakup, but my parents told me that phone service got considerably better and cheaper after the AT&T breakup, which makes enough sense to me: if a consumer can drop you for someone else, you have a reason to try and compete on service and/or price.

  53. 53. WarmWash||context
    Google has a monopoly because of the internet's insistence on ad blocking, and its outright indignant refusal to pay a greedy company for daring to ask for money for a "free" web service.

    It's basically impossible to get off the ground competing against Google when 30-40% of people are just freeloading off your service, and 80-90% think the internet is an ethereal realm that everyone could have ad-free and subscription-free access to if we could only agree to starve these greedy middlemen.

  54. 54. tombert||context
    I've heard dozens of people say this (and I've even said it myself) but I don't think it actually holds water. People will pay for things if those things don't suck, and it's not even hard to find examples of that (even with Google products no less!).

    For search, Kagi has had a growing fanbase for a couple years now, but let's take things that have been easy to get for free for decades: Movies.

    People have been able, with relative impunity, to torrent movies for free for a very long time. It's not hard, and the only way you're paying for it is ads for hot MILFs in your area. And yet, despite this having always been an option, somehow Netflix and Hulu and Disney+ and HBO Max have managed to build fairly successful businesses selling movies that could have been pirated.

    I could get YouTube as ad-free with an ad blocker, but I pay for YouTube Premium. I could get all my music for free with Redacted, but I use YouTube music, or I buy CDs. I could torrent video games but I just buy them off Steam or GOG.

    This isn't new either; there were thousands of free forums on the internet in the late 90s, yet people still bought accounts on Something Awful for quite a while (and indeed still buy accounts, though in much lower numbers).

    We can certainly argue about how much value these companies are providing, and we can argue about how it's annoying how there's a million different streaming services now and how that's really irritating, but my point stands: people do pay for things on the internet.

    We don't have to accept that companies need to sell all our data. We don't have to accept being bombarded with ads. We don't have to accept that people won't pay to use services.

  55. 55. smallmancontrov||context
    The word "lawful" always seems to get dragged out when people in power are doing some especially heinous rulemaking, like throwing a hissy fit over a single company trying to voluntarily draw a line at domestic surveillance and fully automated killchains.
  56. 56. bko||context
    A private corporation can choose not to sell to the government; a lot of them do exactly this. There are a lot of hoops to jump through.

    However, if they do sell to the government, they shouldn't have some sneaky way to exert control over decision making using their products. We're a country of laws, and for better or for worse, these laws are made by elected officials and those appointed by elected officials.

    Why an American company wouldn't want American defense to have the most capable tools at their disposal is a different matter altogether, but here we are.

  57. 57. joshuamorton||context
    > they shouldn't have some sneaky way to exert control over decision making using their products.

    Why not? Many companies have all sorts of rules you agree to when using their products, including rules against many legal ("lawful") things. Are you saying that the government as a client should be unbound by contractual obligations that apply to other clients?

  58. 58. throwup238||context
    Governments negotiate their own contracts with their own terms of service. That’s one of the hoops government contractors jump through.
  59. 59. tombert||context
    This administration has made it very clear that they will do what they can to change laws whenever convenient, without congressional oversight, whether or not they are "allowed" to.

    Trump immediately implemented tariffs he wasn't allowed to, he started a war he probably wasn't allowed to in order to (allegedly) distract from his association with a pedophile, he wrote an executive order trying to undo the Fourteenth Amendment, and he has actively been abducting and imprisoning lawful residents (and even citizens!) while pushing for racial profiling to do so.

    If a company feels like the government will simply rewrite the laws in order to advance any kind of political whim (including to be weaponized against that very company!), it's not wrong or even weird for them to want to add safeguards to their product.

    To be clear, this isn't weird or uncommon. Lots of the stuff you sign in a EULA isn't there to prevent you from doing things that are "illegal".

  60. 60. WarmWash||context
    Anthropic wanted the ability to verify compliance whereas OAI and Google are fine with "trust us". Which is how it always is, and always has been.

    For better or worse, the government is the one who audits, and it has its own internal systems for self-audits. So no one except them tells them what they can or cannot do. The government would never put itself in a position where civilians died because Amodei didn't like the vibe of the case being worked.

    In a way it's wild that people are upset that the government didn't put a billionaire megacorp CEO in the driver's seat of intelligence.

  61. 61. bko||context
    I'd prefer our elected officials own the manual, accepting the fact that [person I don't like] could be in power and rewrite the rules, than a private billion-dollar corporation. Especially when it comes to defense.
  62. 62. mc32||context
    Ha! If Congress did diddly squat about being eavesdropped on by organizations that aren’t supposed to spy on citizens back in the Obama days (we also spied on allies’ governments, but that’s kinda what all of them do), there is no hope of them reining things back at all… for mere hoi polloi.
  63. 63. bko||context
    I guess we have to appoint Amodei and Altman as our benevolent dictators to keep Congress in check!
  64. 64. hgoel||context
    How well does this hold up under legal scrutiny when previous actions indicate that the Pentagon would retaliate against Google if they didn't accept this "lawful use only" farce?

    Could Google back out of this agreement later by arguing that they were coerced?

    Not trying to suggest that Google would be opposed to doing evil, but curious about how solid this agreement would be in practice.

  65. 65. john_strinlai||context
    There is zero reason that the definitions of 'lawful' for the purposes of these agreements should be classified.
  66. 66. svachalek||context
    There's a reason, you just won't like it.
  67. 67. mullingitover||context
    Reminder that this administration has some absolute howler theories about what constitutes lawful behavior[1].

    [1] https://www.nytimes.com/2025/09/20/us/politics/tom-homan-fbi...

  68. 68. shevy-java||context
    The beginning of Skynet 6.0.
  69. 69. qznc||context
    And that is newsworthy because unlawful use is normal?
  72. 72. Brian_K_White||context
    What a handy word "lawful".
  73. 73. Imnimo||context
    Unsurprising from Google, but still bad. If Google has no right to object to a particular use, this is equivalent in practice to "any use, lawful or not".
  74. 74. anygivnthursday||context
    Is Iran already a vibe war, or are those still coming?
  75. 75. anematode||context
    Who could have seen this one coming. From yesterday: https://www.cbsnews.com/news/google-ai-pentagon-classified-u... ("Hundreds of Google workers urge CEO to refuse classified AI work with Pentagon").

    Any AI researcher who continues to work here is morally compromised.

  76. 76. devin||context
    That's what the 7 figure salaries are for.
  77. 77. testfrequency||context
    It’s funny to me how many progressive people I know and am friends with, from marginalized demographics (trans, gay, Latino, Black), work at these AI companies.

    They still have faded Bernie stickers on their cars, organize for No Kings, say “fuck SF, I’m in the east bay for life, fuck tech” - and they all make seven figures Monday through Friday by supporting the death of society and democracy.

    I don’t dare say anything though, because “money is money” and the bay is expensive… but I do sure as shit judge every single person I know who joined OAI, Anthropic, Google, and Meta.

  78. 78. site-packages1||context
    I would suggest looking inwards if this is how you really feel.
  79. 79. gambiting||context
    I'm curious what is that you're suggesting, exactly.
  80. 80. site-packages1||context
    I made another comment above. People contain multitudes. Different contexts, different choices, not everyone is in a box defined by the viewer's world view. You can't really know what's going on with someone else, in their heads, in their context, so give them some grace. Instead, this person's "friends" are "hypocrites" who were "lured" into their choices. It's very condescending. I am suggesting the poster re-examine their own views on other people in light of this.
  81. 81. foltik||context
    You're missing the point. They're just lamenting the contrast between what their friends say (fuck tech, no kings) and what they spend their workweek in service of.

    It's not complicated: if these friends would take a non-society-destroying job at equal pay (who wouldn't?) then their values aren't driving the decision, money is. Fine, that's a choice adults get to make. But then own it and actually justify it on its merits, don't just retreat to "who are you to judge."

  82. 82. senordevnyc||context
    Not everyone sees AI as "society-destroying".
  83. 83. testfrequency||context
    I mean no harm in saying what I said, I love my friends. I just can’t stomach the hypocrisy, it’s what the companies are preying and feeding off of.

    My friends are incredibly bright and good at what they do, it’s why they all have the roles they have. It makes me sad (and frustrated) knowing they are lured in by enough money dangling in front of them that makes them swallow their souls and identity, while fuelling the fire in the same breath.

    I have a deep amount of respect and gratitude for my friends (and anyone else) who choose to work at non-profits and more ethical, mission-based companies for less. I hate how much these AI companies and roles are offering people; it’s completely forced lots of gifted people into a war machine.

  84. 84. site-packages1||context
    Do you suspect there is any chance they are fully independent adult human beings with full agency, who have looked at the pros and cons, and chosen to make the choices they did with clear eyes? Do you think there's any context that might square their choices with their own internal principles that don't make them hypocrites? I mean these as real questions. For "friends you love" you really seem to take a dim view of their intelligence.
  85. 85. testfrequency||context
    I’ll be honest and say it’s made me question and reposition some of my friendships with a number of these friends. Some joined well before we knew how negatively AI would impact society, some have joined in recent years because they were offered 2x their already high comp package, and others will take any job they can get (who, admittedly, I judge far less, as I know they just need to survive in a HCOL city).

    My dim view is more of the AI companies being absurdly overvalued, with more money than they know what to do with, which feeds down into compensation packages, which lure in “innocent” individuals who can’t say no. It’s not been a healthy market to be vulnerable in; most companies outside AI just aren’t getting the same funding or can’t compete at all - and it’s a shit storm.

  86. 86. somenameforme||context
    One of humanity's greatest weaknesses is cognitive dissonance. People can convince themselves of just about anything. And in some ways intelligence is a burden here. A fool will just do something with a reason of 'f you, that's why.' It's only the clever man that will even bother rationalizing the villain into the hero, and we're great at it. An interesting thought experiment is to ask people if they'd be willing to push a button that would randomly kill a person somewhere in the world for a million dollars. They'd have no direct accountability themselves and their action would be unknown to anybody else.

    People will rationalize themselves into declaring this moral even though it is obviously one of the most overtly immoral actions possible. One friend of mine, a rather intelligent guy otherwise, even tried to construct a utilitarian argument that he'd donate some percentage of his 'earnings' to life-saving charities, meaning he'd be saving more lives on net. The fact that if everybody thought and behaved the same way, the entirety of humanity would cease to exist, was a consideration he didn't have a response for. Let alone the fact that he had just rationalized his way into justifying nearly any deed imaginable, so long as you got paid enough for it.

  87. 87. beernet||context
    Agreed. Just shows that big money doesn't dilute small character.
  88. 88. foobar_______||context
    Preach. The hypocrisy is startling. I think people started at these companies maybe years ago with "good intentions" and were willing to turn a blind eye. But now, given just how glaringly clear it is, I don't think it is really excusable anymore. To be clear, people can work wherever they want, including these companies, but what kills me is the hypocrisy. They are pathological liars to themselves if they somehow think they aren't complicit.
  89. 89. tjwebbnorfolk||context
    Why is it morally compromising to work with the military of the country you live in?
  90. 90. plaidthunder||context
    I'm not anti-military as a rule but... c'mon. Opinions on the US military vary.

    In extremis, were the people working for Pol Pot just good patriots with no moral culpability?

    We could surely at least agree that there are cases where working for the military of your home country doesn't fully excuse you from your actions.

    In fact, I think international tribunals have existed which operated on just those principles.

  91. 91. mrexcess||context
    We can all agree that working for the Nazi government’s military would be morally compromising, right?

    You propose that other governments’ militaries would not be so compromising. Seems reasonable.

    But the question then becomes, what is the operative distinction between the two?

  92. 92. orochimaaru||context
    Why is it morally wrong for a US citizen to work with their government?
  93. 93. _vertigo||context
    It’s not morally wrong per se, but just because you are working with your government does not mean what you’re doing is necessarily moral.
  94. 94. cooper_ganglia||context
    Just because you are working with your government does not mean what you’re doing is necessarily immoral, either.
  95. 95. _alternator_||context
    Correct. It depends. For example, it might depend on what the collaboration is likely to result in. Perhaps it would be more likely to be moral if there were some boundaries in place, like "no mass domestic surveillance" or "no fully autonomous weapons".

    Because the US government currently believes it is legal to blow up civilian drug traffickers and wage war without congressional approval. So at some point, yes, collaboration is immoral.

  96. 96. nradov||context
    The US military has deployed fully autonomous weapons since at least 1979, and potential adversaries are now doing the same. For better or worse that ship has sailed.
  97. 97. Forgeties79||context
    So we are wrong to express any opposition or desire to maybe raise the bar here? Aren’t we supposed to be “the good guys”? Or should we just accept a role as the menace of the world, wildly throwing its weight around whenever we have an unscrupulous president?
  98. 98. nradov||context
    Those questions are moot. There are situations where it's simply impossible to have a human in the loop because reaction time is too slow or the environment is too dangerous or communication links are unreliable. Russia is deploying fully autonomous weapons to attack Ukraine today and they will be selling those weapons (or licensing the technology) to their allies. There is no option to stop. And let's please not have any nonsense suggestions that we can somehow convince Russia / China / Iran / North Korea to sign a binding, enforceable treaty banning such weapons: that's never going to happen.
  99. 99. t-3||context
    There's always an option to stop. We can choose civility over barbarity, stop trying to kill people over 1000+ year old dick waving contests, and stop threatening each other with doomsday weapons because your grandpa shot my grandpa. Just because our leaders are too stupid and cowardly doesn't mean there's no option.
  100. 100. nradov||context
    Sounds good! Please convince Vladimir Putin to choose civility over barbarity, then get back to us so we can discuss options.
  101. 101. Forgeties79||context
    We aren’t Russians and Putin is not our leader. We can choose how we behave and operate. This is like saying we should use chemical weapons if someone else deploys one. You’re speaking as if it’s all so binary. “Do what they do or you lose.”
  102. 102. _alternator_||context
    Look, a dumb bomb is a fully autonomous weapon once it's launched. Let's be real: an LLM making decisions on who to target and when and where to launch munitions represents a meaningful change in our concept of autonomous weapons.
  103. 103. Forgeties79||context
    Who said otherwise? Clearly it’s about facilitating specific acts by the government. Why are y’all acting like it was so wildly broad? No one said “working with the government is inherently immoral.”
  104. 104. cooper_ganglia||context
    Literally the parent comment:

    >Any AI researcher who continues to work here is morally compromised.

  105. 105. Forgeties79||context
    …doing this kind of work with the federal government. That is clearly what they are saying. You stripped all context from the discussion.

    You’re looking for the least defensible, worse interpretation of their comment.

  106. 106. cooper_ganglia||context
    No. Their comment was: “Any AI researcher who continues to work here is morally compromised.”

    But, “…doing this kind of work with the federal government.” is added context that was not there and is based on your own interpretation.

    The language of the parent comment charges that simply working at a company that is engaging in this makes one complicit in an immoral act, and the complicity itself is immoral. I disagree with all of that.

  107. 107. Jtarii||context
    Hegseth bombed a girls' school in Iran last month. I think it's fair to doubt the moral worth of anyone assisting this admin.
  108. 108. conartist6||context
    It's ok, they weren't Christian girls, so of course they're in hell now. ...where Pete will go!

    Hey, I think I'm starting to get how this organized religion thing works. Maybe I'll join a few to make sure I go to allllll the good places

  109. 109. xp84||context
    Is it your position that any collateral damage in war is unacceptable and makes the one who caused that harm forever evil? Or that the whole world should adopt pacifism so that war is no longer practiced at all?

    If the former, this places a huge incentive on dictatorships like Iran to use the very easy strategy of co-locating all military targets with schools, hospitals, etc. so that any attack on them by anyone is automatically immoral.

    I don’t automatically think everything the US has done (either in Iran this year or in history) is good, best, righteous btw. But positions like yours seem to take for granted that it’s never okay to wage any kind of war.

    Set aside for a moment whether it’s safe to classify the Islamic Republic as a truly evil regime.

    I don’t want to tempt Godwin’s Law, but after seeing how the Left in the US and Europe rallied to the cause of supporting Hamas, I don’t think modern-day “progressives” have the courage to do anything to counter truly bad actors besides ask them nicely to stop. I’d love to see someone from that political alignment explain where their red lines are, past which they’d morally support a military attack - and yes, even one where we can be nearly certain innocents will also be hurt or killed.

  110. 110. somenameforme||context
    I don't think that was intentional, but invading countries while trying to distract them with negotiations, randomly assassinating leaders and hoping everything just turns out well, threatening to "destroy civilizations", targeting bridges and more, all while aiding and abetting Israel which is intentionally destroying pharmaceutical, educational, and other such civilian institutions is all 100% intentional.

    In some ways worse than bombing the school was the effort to implicitly deny it. The school was near a military facility, and itself was a military facility in the past. US intelligence screwed up. They should have simply acknowledged what happened and why. Their response just reeked of cowardice and malice at the highest level.

  111. 111. t-3||context
    In a logical or mathematical sense, sure, but when it's the US government and a huge surveillance-tech company it's pretty necessarily immoral (at least in an American context where harming liberty is immoral - other cultures disagree).
  112. 112. mattnewton||context
    Idk about morality, but it’s certainly a way to stop dystopian mass surveillance nightmares if everyone capable of building one refuses.

    So if you live in the US and don’t want one government agency in the US to have this power (that is ambiguous under current law), one way you can try to avoid it is by refusing to sell it to them and urging others to do the same.

    It’s a long shot, sure, but it certainly seems more effective than hoping the legislature wakes up and reins in the executive these days.

  113. 113. psychoslave||context
    Given most governments' policies and direct engagement in all kinds of monstrosities over the millennia, there is really no reason to limit the case to the USA, indeed.
  114. 114. finghin||context
    The acts of the government being wrong in an upsetting number of cases would be a big reason.
  115. 115. tyre||context
    It’s not, but legal is not the same as ethical.

    For a long time it was legal for the US to torture enemy combatants, and it probably still is. It was never ethical.

  116. 116. rob74||context
    If you add to that the very broad limits of what the current administration considers "legal" (as in "pretty much anything we want to do"), I can understand feeling uneasy as a Google employee...
  117. 117. gigatree||context
    You’d need some shared ethical/moral framework to make that claim, which doesn’t really seem to exist anymore
  118. 118. yibg||context
    You don't need a shared moral framework to come to a personal moral conclusion.
  119. 119. lo_zamoyski||context
    What does that mean? How does one come to a personal moral conclusion? Vibes?

    (I take "moral framework" to mean a principled stance that gives objective grounding for a moral judgement. I agree that we can come to a moral judgement without putting it through a systematic and discursive defense, and I reject the notion that there are many moralities or that they are arbitrary, but it is also true that diverging conceptions of the basis of morality will frustrate agreement. Stopping at personal moral judgement does not lend itself to fruitful dialogue and understanding, as it constraints the domain of what is intersubjectively knowable.)

  120. 120. hashmap||context
    Working to directly advance a product used substantially to oppress people via surveillance or war crimes, when you have many other choices, is immoral. Easy.