
Omissions, Deceptions, Lying. The New Yorker Asks: Can Sam Altman Be Trusted? (newyorker.com)

A 17,000-word exposé in the New Yorker reveals that "several executives connected to OpenAI have expressed ongoing reservations about Altman's leadership." Reporters Ronan Farrow and Andrew Marantz spoke to "a hundred people with firsthand knowledge of how Altman conducts business," including current and former OpenAI employees and board members.

Among other revelations, internal messages from a few years ago show that OpenAI executives and board members "had come to believe that Altman's omissions and deceptions might have ramifications for the safety of OpenAI's products..." At the behest of his fellow board members, [OpenAI cofounder] Sutskever worked with like-minded colleagues to compile some seventy pages of Slack messages and H.R. documents, accompanied by explanatory text... The memos, which we reviewed, have not previously been disclosed in full. They allege that Altman misrepresented facts to executives and board members, and deceived them about internal safety protocols. One of the memos, about Altman, begins with a list headed "Sam exhibits a consistent pattern of . . ." The first item is "Lying"....

In a tense call after Altman's firing, the board pressed him to acknowledge a pattern of deception. "This is just so fucked up," he said repeatedly, according to people on the call. "I can't change my personality." Altman says that he doesn't recall the exchange.... He attributed the criticism to a tendency, especially early in his career, "to be too much of a conflict avoider." But a board member offered a different interpretation of his statement: "What it meant was 'I have this trait where I lie to people, and I'm not going to stop.' " Were the colleagues who fired Altman motivated by alarmism and personal animus, or were they right that he couldn't be trusted?

On Friday, Altman responded in part to the article. ("I am not proud of being conflict-averse, which has caused great pain for me and OpenAI," he wrote in a blog post. "I am not proud of handling myself badly in a conflict with our previous board that led to a huge mess for the company.")

But the article also assembled similar stories from throughout Altman's career:

- At Altman's earlier startup Loopt, "groups of senior employees, concerned with Altman's leadership and lack of transparency, asked Loopt's board on two occasions to fire him as C.E.O.," according to Keach Hagey, author of the Altman biography The Optimist.

- During Altman's time as president of Y Combinator, "several Silicon Valley investors came to believe that his loyalties were divided. An investor told us that Altman was known to 'make personal investments, selectively, into the best companies, blocking outside investors.'" The article adds that in private, Y Combinator co-founder Paul Graham "has been unambiguous that Altman was removed because of Y.C. partners' mistrust... On one occasion, Graham told Y.C. colleagues that, prior to his removal, 'Sam had been lying to us all the time.'"

- "In a meeting with U.S. intelligence officials in the summer of 2017, he claimed that China had launched an 'A.G.I. Manhattan Project,'" the article points out, "and that OpenAI needed billions of dollars of government funding to keep pace...." But one intelligence official "after looking into the China project, concluded that there was no evidence that it existed: 'It was just being used as a sales pitch.'"

- As California lawmakers considered safety testing for AI models, one legislative aide complained of "increasingly cunning, deceptive behavior from OpenAI." OpenAI later subpoenaed some of the bill's top supporters (and OpenAI critics), in some cases asking for their private communications to investigate whether Elon Musk was funding them. [The article notes an ongoing animosity between Altman and Musk. "When Altman complained on X about a Tesla he'd ordered, Musk replied, 'You stole a non-profit.'"]

And "Multiple prominent investors who have worked with Altman told us that he has a reputation for freezing out investors if they back OpenAI's competitors." [M]ost of the people we spoke to shared the judgment of Sutskever and Amodei: Altman has a relentless will to power that, even among industrialists who put their names on spaceships, sets him apart. "He's unconstrained by truth," the board member told us. "He has two traits that are almost never seen in the same person. The first is a strong desire to please people, to be liked in any given interaction. The second is almost a sociopathic lack of concern for the consequences that may come from deceiving someone."

The board member was not the only person who, unprompted, used the word "sociopathic." One of Altman's batch mates in the first Y Combinator cohort was Aaron Swartz, a brilliant but troubled coder who died by suicide in 2013 and is now remembered in many tech circles as something of a sage. Not long before his death, Swartz expressed concerns about Altman to several friends. "You need to understand that Sam can never be trusted," he told one. "He is a sociopath. He would do anything."

Multiple senior executives at Microsoft said that, despite [CEO Satya] Nadella's long-standing loyalty, the company's relationship with Altman has become fraught. "He has misrepresented, distorted, renegotiated, reneged on agreements," one said... The senior executive at Microsoft said, of Altman, "I think there's a small but real chance he's eventually remembered as a Bernie Madoff- or Sam Bankman-Fried-level scammer."


Comments Filter:
  • by david.emery ( 127135 ) on Saturday April 11, 2026 @03:55PM (#66089174)

    The New Yorker Asks: Can Sam Altman Be Trusted?
    "No." https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by burtosis ( 1124179 ) on Saturday April 11, 2026 @04:02PM (#66089180)

      The New Yorker Asks: Can Sam Altman Be Trusted? "No." https://en.wikipedia.org/wiki/... [wikipedia.org]

      So this might be the prick that pops the AI bubble?

      • by keltor ( 99721 ) * on Saturday April 11, 2026 @04:26PM (#66089196)
        Probably just the OpenAI bubble is going to pop. Same with the Tesla bubble popping. Anthropic doesn't seem likely to pop, but could as they are too big to buy now. Google isn't going to pop, Microsoft isn't going to pop - in both cases they have strong profits. Amazon AWS has strong profits, the Chinese companies have strong profits and all the Gen AI services companies would just fire the divisions and move on.

The market as a whole is not in a bubble, and that's pretty obvious if you look at actual market-bubble stats.
        • by haruchai ( 17472 )

          Dario Amodei may be worse than Altman; he just looks less superficially like a sociopath

        • Google, Microsoft and Amazon are hella-overvalued and long-overdue to pop.
• There is a huge AI bubble, it's just not going to pop with the top-tier model providers. They'll be fine. But the companies building data centers that are sitting on expensive, rapidly outmoded GPUs.. the thousands of companies that are building a single AI feature and hoping to be acquired.. the AI service companies trying to grab some market in some vertical.. There's a lot that's not going to work, and there's going to be a turning point where those guys start dying off and the money being thrown aroun
• OpenAI is not the only major player right now. Google and Anthropic in the US, and to a lesser extent Meta/Facebook. OpenAI is just the largest. And there are a lot of Chinese models now also. And this isn't the only negative thing that has been said about OpenAI or Sam Altman. One may recall the entire mess where they transitioned from being a non-profit to a for-profit, and all the drama from that largely revolved around Altman. When there's a bubble it is very tough to tell what finally i
• It's important to distinguish desirability from path dependency. Yes, the Internet bubble burst and the AI bubble will burst, but often the technologies that remain can be attributed to the damage that the bubble did to the economy while it was happening, as opposed to being intrinsically desirable or efficient.

          For example, internet technologies and the way we architect stateful applications on Javascript and web browsers today may be viewed as highly suboptimal client/server computing, but the bubble distort

    • by haruchai ( 17472 )

      Beat me to it

      • by shanen ( 462549 )

        It was too obvious, but the problem is that the headline could always be reworded the opposite way. In this case "Is Sam Altman untrustworthy?" Then Betteridge's automatic "No" response would become an affirmation of his trustworthiness.

        On the record of his behavior I would say that Altman wants to act in a trustworthy way, but on the record of tech companies operating in the real world, I would agree with the "Heck no" responses.

        Citation required but hated? Latest is Facebook by the great Steven Levy. Qu

  • Sure, some CEOs are more ethical than others. But every last one of them wants to represent their company, and their own performance in it, in the best possible light. This means they will focus on things that make their company, and them, look good. And it means they will not willingly talk about things that make them and their company look bad.

    If you find a CEO that never colors the truth, take a picture, you have found a unicorn.

    • There is no excusing a pathological liar who repeatedly tells lies to deceive people with a financial stake in their company.
• No excuses were intended in my post.

      • There is no excusing a pathological liar who repeatedly tells lies to deceive people with a financial stake in their company.

        And no excusing electing someone like that to be President. /s

        Okay, *maybe* the first time, giving someone the benefit of the doubt, but certainly not the second time. Just sayin' ...

    • by keltor ( 99721 ) *
There are different types of CEO ummm coloring of reality. Altman's problem is lying to other CEOs ... That's a no-no.
      • Right. Lying to investors, lying to other CEOs, lying to the public, lying to customers, lying to regulators. They're all lying, they just have different consequences.

• Almost no one can be trusted. Nearly everyone has imperfect knowledge, bias, an agenda, etc. And the more "well-meaning" they believe their agenda to be, the more open they are to pushing it. Accurate or not, it's the ends justifying the means. A "white lie" that leads to a better outcome, or so they believe. This is normal human behavior, left, right, or center.

    The solution is simple. Verify what is said, regardless of whether you like or dislike what was said, regardless of whether the person is fri
    • There are levels. People may make wrong statements that they believe to be true. People may also make wrong statements that they know to be wrong, that is something altogether different. One is being honestly mistaken and can happen to anyone. The other is lying. The great problem that I see in modern society, is that when someone changes their position on something, because they have new information, they are accused of waffling or wavering. Stupidly clinging to an old position that you took in good
      • by drnb ( 2434720 )

        There are levels. People may make wrong statements that they believe to be true.

        As I said, "imperfect knowledge" is one of the factors. But the result is the same, one needs to verify the evidence. Perhaps we should repurpose your "level" clarification. How much work you put into evaluating the evidence depends on the "level". A friend says a movie is good, fine, trust and go see it.

        The great problem that I see in modern society, is that when someone changes their position on something, because they have new information, they are accused of waffling or wavering. Stupidly clinging to an old position that you took in good faith and now know is wrong is just as bad as lying about it intentionally.

Which would be less of an issue if more people had a more scientific approach. Also, such accusations are useful. They help spot the liars and the naively trusting. A hazard of all, the left,

        • Mostly the attacks on people changing their position are aimed at stopping them from doing so and thereby supporting the position they have found reason to abandon. Politicians often support positions they know are wrong, for their own profit. Checking the evidence is always a good idea but is not always practical. If the US says that the Chinese are secretly planning to invade Taiwan, for example, how could anyone expect to check. They presumably have access to sources that we do not.
          • by drnb ( 2434720 )

            Mostly the attacks on people changing their position are aimed at stopping them from doing so and thereby supporting the position they have found reason to abandon.

            That should undermine the credibility of the people attacking, those supporting the old position found to be incorrect.

In other words, attacks should not be trusted without data to back them up.

            If the US says that the Chinese are secretly planning to invade Taiwan, for example, how could anyone expect to check. They presumably have access to sources that we do not.

            Well in a functioning democracy the executive branch would be verified by the legislative branch. The leadership of the House and Senate Armed Services Committees should have been advised and seen the evidence. Not a perfect solution, some trust is still involved, but at least there are two groups. Maybe three depend

    • CEOs and managers don't want "trust," they want control. That's why they like kiss-ups, they think they can control them.
    • > Verify what is said,

Finding out information that only exists inside a company, particularly a private company not listed on the stock market, is extraordinarily difficult. The walls are closed. How should we go about verifying what is said?

      • > Verify what is said,

Finding out information that only exists inside a company, particularly a private company not listed on the stock market, is extraordinarily difficult. The walls are closed. How should we go about verifying what is said?

If it's not verifiable, then the claim is not to be trusted.

Think about the appeal-to-authority fallacy. It's based on a "trust me." A true authority does not wave credentials and say "trust me." A true authority can show and explain the data in a way the masses can understand, so that the masses can see for themselves the validity of a claim. The same applies to corporate spokespeople. It's a fallacy to believe them merely on their job title.

  • Why did they make him CEO if they know so clearly that he can't be trusted?

  • by Arrogant-Bastard ( 141720 ) on Saturday April 11, 2026 @04:57PM (#66089242)
    This man doesn't care what he breaks or destroys, who he hurts or kills. There is absolutely no compassion or empathy in him. He's a monster.

If you think that's harsh, first let me assure you that it's not. Second: read the article. And third: or just pay attention to what he's said and done.

    "When someone shows you who they are, believe them the first time." --- Maya Angelou
• "He's a monster." Interesting! So what does that say about the OpenAI employees who threw the hissy fit when the board tried to deal with him?
      • by Anonymous Coward

They were told that their stock incentives would be worth nothing unless they all went to Twitter and posted "OpenAI is nothing without its people," so that's what they did.

    • by gweihir ( 88907 ) on Saturday April 11, 2026 @07:10PM (#66089404)

      Indeed. No argument. What I am still surprised at is that people fall for these types again and again.

• CEOs (and corporate hierarchy types) mistake sycophancy for trust. Maybe they think they can control the sycophant, and they often can.
      • Well, the last time he was fired it turns out that Satya Nadella intervened.

        (Now you know who to blame.)

      • by Deef ( 162646 ) on Sunday April 12, 2026 @08:44AM (#66089926)

        There are reasons not to be surprised by this, but it requires some research.

        Specifically, TFA says:

        "He has two traits that are almost never seen in the same person. The first is a strong desire to please people, to be liked in any given interaction. The second is almost a sociopathic lack of concern for the consequences that may come from deceiving someone."

        However, the person who said this is actually wrong. These traits are actually VERY COMMONLY seen together in people who have Narcissistic Personality Disorder, which is actually quite a bit more common than most people realize.

        In fact, these two traits are seen so commonly as to be considered some of the defining traits of the disorder. They are generally phrased as something like "need for approval or praise from others" (this approval or praise is sometimes referred to as "Narcissistic Supply" because narcissists will often chase it even more intensely than money, power, or other apparently more useful goals) and "lack of empathy" (meaning the narcissist tends to be blind to the emotions of others, and tends to assume that, for instance, if he is happy that other people around him are happy, and so forth). There are nine common diagnostic criteria for the disorder, and these are two of them. (Having 5 or more of those 9, as judged by a qualified mental health professional, is diagnostic for the disorder.)

        https://my.clevelandclinic.org... [clevelandclinic.org]

        If you study the other common attributes and typical behaviors of people with the disorder, you start to see why people keep falling for these types of people. Narcissists excel at praising people that they think are useful to them or who have high status, and attacking those who threaten to expose them as being less than the perfect and admirable 'false self' that the narcissist tries to pretend is who they really are. They can become experts at manipulating people to give themselves praise and attention. Over time, this essentially becomes a form of conditioning of those people to obey the narcissist at all costs, and can function similar to brainwashing. They also frequently triangulate to use one person or group of people that they control as a means to manipulate others. (Aka "Flying monkeys": that's a technical term in therapy. Really.)

        Note the similarities to cult indoctrination. Narcissists very commonly form cults of personality around themselves. Also, someone who has been "trained" by a narcissist is also more likely to fall under the sway of another one, since their defenses (boundaries, etc.) have been damaged and may not function well anymore.

        Also, dating a narcissist is one of the worst things you can do for your mental health, and is on the same level as dating someone who is physically abusive. It can cause long-lasting psychological damage that lasts for decades after the relationship is over. With a narcissist, unlike a physical abuser, though, the scars don't show...and even the abused may not understand that they have been abused because they've been taught that they were the problem all along. Many narcissists can do this sort of thing even without really intending to, not out of deliberate malice (although "malicious narcissism" is a subtype), but just due to the way their worldview works: they see the other person as at fault, and can even rewrite their own memories to cast themselves in the role of a hero or victim even when actual events (e.g. video recordings) prove otherwise, since their desire to always see themselves in a positive light is so strong.

        The sort of behavior described in TFA is all completely expected for someone with NPD.

        (Note by the way that many sociopaths and psychopaths are also narcissists: the disorders are by no means mutually exclusive.)

        As an exercise to the reader: Try grading other notable wealthy and powerful people against the criteria mentioned in the article I linked to a

        • by gweihir ( 88907 )

No argument. And we have some nice other examples of people who clearly suffer from this in positions of power in the US. What I am wondering is why there are no evolved defenses against being manipulated by these people. Maybe for a tribe-sized group, having a leader of this type is an advantage. Or maybe that is the wrong question to ask.

          • by Deef ( 162646 )

            To answer your question: there are defenses against narcissists. The problem is that these defenses aren't typically useful the first time a narcissist is encountered, since people are usually not expecting an apparently nice, friendly, supportive person to end up behaving this way: narcissistic behavior is contrary to normal human behavior, and therefore usually unexpected.

            Narcissists are typically most successful when dealing with people who don't know them well. They can be incredibly charming on first c

  • The guy is a caricature of an investment banker, has been from the start of OpenAI. Bit late to complain about it now.

The better question: why did everyone throw money and media adoration at him when it was already obvious?

• I asked my friend "Satan Lord of Darkness, Deceiver of All, Most Unclean" and he says Altman is cool and pre-sold him a bunch of OpenAI stock, so trust away!
  • by 93 Escort Wagon ( 326346 ) on Saturday April 11, 2026 @05:47PM (#66089322)

    I wasn't lying to you - I'm just conflict-averse!

  • A 17,000-word expose

    What is it: expose or exposé?

Ah fuck, /. can't handle that. No wonder it's written wrong. Where are the anti-Apple trolls to tell me to turn off smart quotes? As if that would help?

    • Trolls are people who say shit they don't believe just to piss you off.

Nothing could be more like drinking the piss of the ghost of Steve Jobs than believing people criticizing Apple are doing it to make you mad.

    • What is it: expose or exposé?

      It's exposé. Without the acute accent, it's a spelling mistake.

Some English speakers are accent-averse, no matter what the cost in clarity and consistency.

      Without the acute accent, what are we to make of an effort to expose someone with an exposé?
      Or that Marion's gold lamé top looks lame on her?
      Or that I'm going to resume writing my résumé?
      Without it, the word 'née' would seem to belong in a Monty Python sketch about knights and shrubbery.

      Praise accents! They're useful.

  • Why is this even a question? Are they stupid or what?

  • by Dripdry ( 1062282 ) on Saturday April 11, 2026 @09:31PM (#66089520) Journal

He was fired from Y Combinator, and the people at his startup, Loopt, asked the board to fire him because of his chaotic and deceptive behavior.

The guy is a disgusting, lying, cheating sociopath who couldn't run a company if Reebok gave him all their shoes.

    • And he stole the company from the nonprofit set up to keep tabs on its ethics. Perhaps one of the biggest thefts in history. I don't know why anyone trusts this guy.

  • 2022: "AGI in 2 years"
    2024: "AGI in 2 years"
    2026: "AGI in 2 years"
    Answer is pretty clear.
  • OpenAI kick-started the multibillion-dollar industry and made what appears to be AI accessible to millions of people.

    This article still reads like someone wants to displace him. It looks like an attempt to dig up dirt on him.

    Sam is probably not perfect. Very few of us are. The question is: does he advance OpenAI and AGI? Do OpenAI researchers feel like they have the freedom and ability to achieve their goals and complete their tasks successfully?

    If the answer to both questions is 'no', he might be replaced.

  • He reminds me of someone running a social media empire.
  • That you're willing to believe any capitalist can be trusted, even at this point when the truth is literally fucking you in the ass - i mean, ngl, they're not wrong when they say they're just giving you what you deserve.

    They're legally required to hide the truth where the truth would hurt investors. That's the actual LAW. You're an IDIOT for believing anything any of them say.

    The only "new" thing here is the suggestion that the conflict-averse could somehow thrive as a CEO. That he's a compulsive liar ra
