On May 3, 2021, I wrote a note to myself about the type of people OpenAI was hiring. The note: looks like OpenAI is getting into the military business by hiring a former CIA clandestine operator, Will Hurd (https://en.wikipedia.org/wiki/Will_Hurd). Seems like I was right, but this should be expected, because every corporation is linked in one way or another to the military-industrial complex.
Does the paywall on the archive page mean we cannot archive The Intercept? https://archive.is/Lqfyr
Unsetting 'max-height' on `#u-s-military-makes-first-confirmed-openai-purchase-for-war-fighting-forces` reveals the rest of the article for me, so it's there, just hidden.
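For anyone who wants to reproduce that, here's a rough devtools sketch to paste into the browser console. The selector is the element id mentioned above; the `article` variable name and the use of `max-height: none` as the override are just my assumptions about how the hiding works:

    // Rough sketch: lift the max-height cap that hides the rest of the article.
    // The selector comes from the comment above; everything else is assumed.
    const article = document.querySelector<HTMLElement>(
      '#u-s-military-makes-first-confirmed-openai-purchase-for-war-fighting-forces'
    );
    // 'none' overrides a max-height set by a stylesheet, not just an inline style.
    article?.style.setProperty('max-height', 'none');

(That's TypeScript-flavored; drop the `<HTMLElement>` annotation to run it as plain JavaScript in the console.)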
So now hallucinations can have deadly consequences.
Good job Sam Altman and all the employees who backed his return.
You are now in the death business.
This makes several logical leaps that are unwarranted. It would be interesting to hear more about how they are using it, though.
My experience with the military is that there are huge number of functions involved, and actual warfighting is a small fraction of that.
It's basically the world's largest logistics enterprise. It's like Amazon, but for much heavier and more dangerous goods and service areas.
Meanwhile, Whisper is busy hallucinating while it transcribes your most recent encounter with your doctor.
Good, I'm glad someone in SV ain't afraid of working with the military.
Freedom delivery business, please.
Idk why this is downvoted...
Because it assumes OpenAI is being used for anything remotely interesting rather than writing report summaries. This may make for a fun media story, but unless it's significantly tweaked, it's still just an expensive multimodal model. It's not going to be used outside of offices.
Have you never heard of the banality of evil? The Nazis used IBM punch cards. IBM was in the death business, even if they weren't making bombs or nerve gas.
Sure. Once OpenAI starts making something actually targeted at the army at their request, that's a different thing. But as far as I understand, so far the army has just bought a volume licence to a public service.
The actual line is:
> Advanced AI/ML Capabilities: Utilization of Microsoft's native AI services, including Azure AI Search, OpenAI tools, and Azure Synapse for unified analytics and big data processing.
Analytics and data processing by the military is, by definition, the death business.
And food supplies, and healthcare, and energy, and transportation, and housing, and... At the scale of the US Army, what isn't the death business? Almost every profession has some connection. As long as OpenAI sells what they normally sell to everyone anyway, I don't see a problem. The document doesn't even mention any special deals.
It really isn't, in my experience. What makes you think "everything military = death business"? The military tends to be mostly logistics, and ends up funneling resources to tons of efforts that benefit the nation as a whole. One such investment is CCDC SC, which was the subject of a 99 PI podcast, IIRC. They've worked for decades to research and operationalize shelf-stable foods, for example. These approaches have made it into civilian foods for the same reasons they are used in the military: convenience, especially under dire circumstances.
https://en.wikipedia.org/wiki/Combat_Capabilities_Developmen...
Drug dealing is also mostly logistics, so they are not in the crime business?
>>What makes you think "everything military = death business"?
mil·i·tar·y /ˈmiləˌterē/ (adjective): relating to or characteristic of soldiers or armed forces.
Hope this helps.
Because the army doesn't build levees on the Mississippi, and GPS and the Internet are strictly used to murder people.
Just like the Roman legions only built roads as murder conveyors that were never used by normal citizens.
Facial recognition with a proven bias against certain minorities being used by ICE? Yeah, I got a big problem with that.
Some private or low-level staffer looking for some boilerplate language or editing feedback on a PowerPoint presentation? Using ChatGPT as a better Google Translate instead of getting more locals and their families into a precarious situation? Not a problem, imo.
>Some private or low-level staffer looking for some boilerplate language or editing feedback on a PowerPoint presentation? Using ChatGPT as a better Google Translate instead of getting more locals and their families into a precarious situation? Not a problem, imo.
that's beside the point.
parent objects to the war machine -- even those 'low-level staffers and privates' are contributing meaningfully to the machine in which they operate.
if the end result of a mundane process is that a bomb gets dropped and lots of people die, it doesn't really matter to most people how minor the role is; like it or not, even the guy getting coffee for the general is part of the process that ends in an action the parent objects to.
people often forget this, which leads down the slippery path of 'condoning via non-action' because "well, it's just coffee!", until it all adds up to a 'war crime' or a 'tactical victory'.
War is terrible. Nobody likes war. Yet war has also been the norm for as long as human beings have existed. Saying the military is the war machine is a cheap emotional appeal.
The military also fights to protect civilians when war breaks out.
The military is the deterrent force that prevents wars.
The military plays a key role in rescue and recovery when natural disasters hit.
Most military forces have more peacetime than wartime duties.
It helps to understand your misconception: less than a fourth of the US armed forces are in a combat role, or will ever be put into one.
You mean summaries like the one about this journalist, where the AI confused the cases he wrote about with convictions of his own?
https://www.niemanlab.org/2024/09/a-courts-reporter-wrote-ab...
The same could happen with summaries used to decide whether someone is a threat.
> ... writing report summaries.
Hallucinated report summaries could go quite badly if used uncritically by decision makers.
Especially if they're ingested by AI systems further up the decision making chain(s).
If no one ever downvotes what you say, you're probably not taking enough risks in your commentary. (But if you know most people will downvote it, especially if you know it's going to be flagged, you probably shouldn't say it. Not for any big reason, just because it's rude.)