I think this is a general cultural phenomenon, not specific to the sciences: people are told they need to be aggressively confident to be taken seriously. Even as kids, parents and teachers push us to edit words expressing uncertainty out of our writing, and we are coached to talk that way when public speaking, at job interviews, and even when dating.
As a scientist, I also think papers and grant applications are more likely to be accepted if they use an irrationally overconfident tone, at least in the abstract and introduction.
Agree. At least in my western country, it has become almost expected that everyone is fully confident all the time.
I’m someone who likes to deliberate on thoughts, take my time, listen to opinions and assess them, then provide my opinion (which may also change over time with more information or changing circumstances). I am humble, a listener, and don’t believe I know everything. I feel a little out of place in current society.
One of my professors at uni (he managed to talk Bill Gates into having Microsoft fund his startup, which is now an established research center) told us that if we don't know something, being honest and saying we don't know is usually a good sign, be it at an oral exam or when answering questions at a startup pitch (...)
I really appreciate it when a professor, asked a question he doesn't know the answer to, stops trying to answer and says something along the lines of "I actually don't know the exact answer; I'll try to find one and tell you next lesson."
In a way, the school system teaches people to fear not knowing something, because at an exam it means getting a bad grade. At work, it is also often frowned upon. This is a real problem in science, since the goal is precisely to find out more about the unknown.
I had another enlightened professor, who also founded a startup, who talked about how frowned upon it is to admit to being wrong at work. Bear in mind that in his opinion this is especially true of Italian workplaces, but I can imagine it is somewhat true elsewhere as well. Knowing why things went wrong is much more useful than knowing who did it and pointing fingers.
>it has become almost expected that everyone is fully confident all the time
I still remember my first discussion with a CEO in my first job as a programmer (not even out of uni yet). My project was delayed because of a buggy installer and I had problems reproducing the issue.
I was repeatedly asked "how long until the fix?" and I kept saying "I don't know, it's a bug; it could take two hours or it could take a week".
We both left this conversation frustrated. Though of course for her it was business as usual, and for me it's still awkward and stressful when I think back to it.
Anyway, this taught me to confidently state something like "this will take three days" instead of trying to convey uncertainty to higher management.
I think there are personality traits at play here too: simply being introverted means you are more focused on whether your ideas make sense to you than on how they will be received or what they will make other people think of you.
I think there are intelligent extroverts who just aren't really that worried about whether their opinions are correct, except in rare cases where the consequences of being wrong would be severe.
When I was younger I had a bad habit of taking what information I did know, piecing it together to try to fill in what I didn't know, and then stating the result confidently. Many times I was wrong. In fact, most times I was wrong and not even close. But I was expected to be that way in order for people to listen to me. I kept doing this almost until I was twenty because of that expectation.
The one good thing that came out of this maladaptive behaviour is that I learned to take my hunch, piece together the information, and then use that to go looking for what I didn't know. In a way it got me addicted to constantly learning, something I never managed in school.
That is very much a normal human condition, honestly. In my experience, what I'm seeing from time to time is that younger folks have way more confidence in things they're wrong about than I did at their age. I had some confidence, no doubt, but ample room for error, where I would regularly run a "trust but verify" pattern on my own bullshit. I learned from who knows where that it's bon ton to say "I don't know", so I got into the habit of withholding my theories in favour of yielding to the other parties. Of course it helps in tech and business to be discerning. But I'm seeing less of this and way more baseless, unfounded conclusions that have not stemmed from deductive reasoning, conflicting research, and so on, but rather just-so statements and single-dimension emotional reflexes. It's clear the presenter wholly lacks experience, but we play along because it's amusing.
The hard part is articulating to someone to slow down because we run the risk of them being offended. But that’s a different thread altogether.
Your account is dead but none of the posts in your history look bad. You can email hn@ycombinator.com and if there has been a mistake they'll sort it out.
Part of it is that the English language doesn't have tools to express uncertainty without adding verbosity. For example, take what I said here: my sentence length increased by 25% by saying "part of it is" to communicate uncertainty. This gets tiring quickly for both writer and reader. It becomes boilerplate.
That's a good point. As a non-native speaker, I use Grammarly to improve my writing. But often, when I add clauses that indicate uncertainty, Grammarly complains about how I could be more concise by dropping those parts.
Fair, we at least have the subjunctive, we just don't use it very often. If I were to start a sentence like this, I would be using the subjunctive.
(Which I believe has been confirmed by Gemini):
> Yes, the sentence "If I were to start a sentence like this, I would be using the subjunctive" uses the subjunctive mood correctly.
> The subjunctive mood is used to express hypothetical or unreal situations, wishes, suggestions, or demands. In this sentence, "If I were to start a sentence like this" presents a hypothetical situation, and "were" is the correct form of the verb "to be" in the subjunctive mood.
But yeah, it doesn't get at the uncertainty expressed in other languages. Is there another mood for that or how do other languages handle it?
As one teacher and several profs told me, "You don't need to say, 'I think', because the fact that you are saying or writing it means that you think it". In other words, if you are uncertain, don't express the idea, because no amount of verbal hedging will protect you if you express something that is incorrect.
However, the best way to arrive at the correct idea when you are wrong and unsure is to confidently propose your idea as if it were correct: Cunningham's law applied to everything.
writing 'I think X' or 'there's a chance that X' instead of 'X is' or 'I know that X'
> if you are uncertain, don't express the idea, because no amount of verbal hedging will protect you if you express something that is incorrect.
The problem with this lies in the target audience of your writing: whether you're assuming an adversarial audience or a cooperative one. The bigger and less self-filtered the audience is, the more likely it is to be of the first kind.
> the best way to get the correct idea if you are wrong and unsure is to confidently propose your idea as if it is correct - Cunningham's law applied to everything.
Personally, when I see someone being confidently incorrect consistently, I just filter them and their opinions out instead of correcting them, because a lot of the time it's a lost cause.
On Reddit and Twitter, for example, the sheer number of people who are confidently incorrect made conversations not worth having, so I just stopped many years ago.
Everyone has their threshold for how many times you are allowed to be incorrect within a given context, sure.
> adversarial audience...
Yes. Even if the audience you target is small and on your side today, a successful piece of writing will gain an adversarial audience, eventually.
> If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him. - Cardinal Richelieu (apocryphal, apparently paraphrased from hearsay, but still a great quote).
And yet one of the best ways to lose respect in a technical context, especially as a junior, is to be confidently incorrect. Showing humility helps a hell of a lot for all concerned if you've in fact misunderstood something.
A better interpretation is to express the uncertainty at its source, not wrapped in your own subjectivity.
For instance, if you read something in a publication and kind of agree but don't fully trust it, instead of "I think X" you could go with "Publication Y raises the possibility of X".
The real point of the advice is to get rid of fluff, not remove nuance or valuable information.
An explanation some teachers beat into me over the years is that everyone already knows you're only stating a thing because it's your opinion. It's discourteous to waste time repeating something so basic, so such hedging language should only be used when it actually adds something to the conversation -- warning the reader of something with an unexpected level of certainty.
On some level, that makes even more sense in scientific writing. You already have method sections, experimental results, .... If you haven't been overconfident in those sections then the reader really ought to have all the tools they need to appropriately examine the paper.
We lose nuance by using only maximalist statements. Take these rather typical intro phrases:
We find [strongest result]
Our findings strongly indicate [likely result]
Our findings suggest/indicate [result that needs more research]
The way I and many other people read papers is that we read the methods and appendix last; good writing can give you hints about those things without needlessly bloating the paper. The nuance also makes me trust the authors more, as it indicates humility and diligence.
Huh? People sometimes state things that are the genuine facts as far as anyone has been able to prove, but that cannot be 100% verified.
This isn’t some obscure niche case either, it happens every day on HN.
If we're interpreting what I wrote charitably, it's implicit that "obvious" facts will extremely rarely have hedging language, so the anti-hedging conversation can focus elsewhere.
I _knew_ somebody was going to nitpick that phrase and rejected the urge to write a few extra paragraphs in the format of a HN comment expressing the idea more completely...
If we want to look at it a bit less charitably, though, even things that are true "as far as anyone has been able to prove" are usually not perceived to be true by a significant fraction of the human race, so in interactions with randos (like this current thread) it's still reasonable to assume that we're talking about opinions rather than facts. Even when talking about "facts" (as you partially defined them), the core idea still holds -- your _opinion_ (maybe a very well validated opinion) is that those facts are true, and hedging language is discourteous to the reader.
Seemingly random claims that no one has been able to prove so far, but that the person has some special reason to strongly believe are true.
Also e.g. MS Word suggests getting rid of qualifiers.
With papers and especially grant applications, it's ridiculous that everybody knows they are overpromising, but they still don't go through if they don't overpromise.
Before I send something I make sure each paragraph has at least one bit of green underline. The day that Word and I agree on grammar is the day I switch back to pencil and paper!
> Even as kids, parents and teachers push us to edit words expressing uncertainty out of our writing, and we are coached to talk that way when public speaking, at job interviews, and even when dating.
That's the first time I'm hearing this, and it's certainly not something I've ever observed. In this broad generality, I cannot believe that this statement is true.
I'm not sure it's presented as removing expressions of uncertainty.
But modern writing instruction tends to prioritize precise verbs and nouns (choose the most precise word) and takes a very critical eye towards adverbs, adjectives, or prepositional phrases that might simply contribute to cadence, affect, and voice. I find it really hostile to style and readability, but it's easy to evaluate in education because you can just call out the flavoring bits as errors and point to a more precise dictionary word for what's left. Plus, it projects objectivity simply by being "the most right way" to write, leaving the dirty mess of subjectivity, character, and humanity out of the classroom.
But the relevant end result of all that is that there's very little room for qualifiers or tentativeness, as those are usually expressed exactly in the subtle modifiers that people are trained to exclude.
And then this does end up coinciding with a truth of rhetoric, which is that -- much of the time -- the receiver of speech or writing will implicitly accept whatever authority and surety you project. If you sound sure, you must be sure, and if you're so sure, you must have reason to be, so you're probably right.
Put these together, and you get dull, dense technical writing with an authoritative, objective tone, and people are too overwhelmed and passive to bother parsing it enough to challenge it.
Are you american? As a non-american living in the US, the statement from parent post sounds about right to me. Culturally, people sound very confident, even when they have little to no confidence on the content.
That's good advice in that context, and it speaks to the difference in standards of evidence between normal expository or persuasive writing in school and scientific writing. In a scientific paper, if you assert something, it's important to flag to the reader what kind of claim it is, e.g. something demonstrated by other research, something that directly follows from your evidence, or something that could possibly be true given prior knowledge and your evidence.
It was definitely true in my high school English classes ~20 years ago that we would get red ink on our papers if we used what were termed "weasel words" like "suggests" and "believe". I have no idea how representative that was, or which direction the trend has been since then. We were also taught to use short, declarative sentences wherever possible, Hemingway being the ultimate example of the "beauty" of economy in language.
It is my understanding that this is a very USA-centric trend in writing, probably being exported along with the rest of the complex basket of American cultural quirks that come along with US-led internationalization (hegemony if you like) since WWII.
They are generally praised in the abstract, but in concrete terms of actual people attracting a contemporary following (as opposed to distant, historical, and academic respect), boldness has usually trumped humility. The idea that there was a time when people didn't flock to overconfident people presenting strong-leader images over people with genuine humility is just one of the many ways people create a mythical past, to paint the manifestation of long-present aspects of human culture as novel degeneracy into which the world has recently fallen.
> in concrete terms of actual people in terms of attracting a contemporary following (and not distant, historical, and academic respect), boldness has usually trumped humility
I don't think that's true: Confucius, Jesus, George Washington, Lincoln, MLK... Eisenhower, every president before Trump, the New England culture of looking down on ostentation and displays of wealth.... I read a New Yorker article several years ago about a culture of Wall Street leaders in the 1980s who purposely wore cheap watches, had homes with low fences, etc.
Jesus had a relatively small band of followers who abandoned him at the first sign of unpopularity with the authorities; he pissed off more people than followed him, and was killed by the public authorities at the demand of the local population. He got a bigger posthumous following after rising from the dead (actually, or in a myth created by people seeking their own influence).
Says a lot more about the respect for miracles than respect for humility, however much one might read what is written about him as calling for humility from his followers. Meanwhile the list of historical figures whose rise to influence was fueled by the exact opposite of humility is...not short.
> every president before Trump,
Very many of the Presidents before Trump were...not known for humility in their time. Some have had that added to their popular myth afterwards, but plenty not even that. I'm...not young, and Carter is probably the only one in my lifetime for whom you might make the case that humility was something he was particularly known for.
I don't want to get into religious debate, but you might want to re-read the Gospels - you are missing the fundamental message of Jesus, which has turned out to be extremely popular - arguably the most popular text and message in the world.
> Very many of the Presidents before Trump were...not known for humility in their time.
Who? What did they do? Nobody was like Trump or the other people I named.
> I don’t want to get into religious debate, but you might want to re-read the Gospels - you are missing the fundamental message of Jesus, which has turned out to be extremely popular - arguably the most popular text and message in the world.
You might want to reread the upthread context: I was discussing the role of humility "in concrete terms of actual people in terms of attracting a contemporary following" and distinguishing that from backward-looking regard (since we are comparing to how Trump is currently viewed, where we can see only the contemporary regard and not what future generations will think of him in retrospect).
What has happened after the time of Jesus ministry with respect to the impact of works written about him is irrelevant.
What was the attitude toward humility 2000 years ago, and where? I personally have no idea.
Jesus is certainly far, far more influential now than he was 2000 years ago. Christianity was then an obscure religion, and probably nobody outside the immediate region had heard of Jesus.
Jesus' very clear message of humility is published in a massive best-selling book that got gold filigree letters in medieval manuscripts and was promoted in buildings that were, for a thousand or so years, the largest and grandest most people ever saw. It led to the creation of the famously wealthy Catholic church (whose property was famously confiscated by Henry VIII when he needed more gold), led by popes whose desire for expensive art led to the sale of indulgences, which led to Martin Luther and a massive Europe-wide series of conflicts, and also to the less famous but more infamous Knights Templar, who didn't survive having their money taken.
Parable of the sower comes to mind: hear the word, but did not follow it.
Except the popularity in that era was due to all the powerful people saying it proved their right to absolute power and sometimes made other religions (including denominations of Christianity) illegal, and popularity waned rapidly as it stopped being required.
Actually reading the bible is a big part of why I switched from Catholic to Wicca as a teen before deciding it was probably all ahistorical anyway.
(I'm not trying to sell Jesus or Christianity; I'm just talking factually about the text of the Gospels. For those unfamiliar: the Gospels are the first four books of the New Testament, in which the authors describe Jesus' actual words and life, often quoting Jesus. The rest of the New Testament is other people's responses to Jesus and Christianity.)
As I understand it: Humans are complicated, as Jesus depicted them. They are both terminally weak and flawed, and there is also good and justice and mercy in them - the many angels of our natures. Jesus ministered to the weak and flawed, not the good and just, certainly not to the perfect. It's a message of love to people who sin, which is pretty much everyone; Jesus didn't expect differently. His message was to love the sinners and for the sinners to accept his love.
All human institutions are flawed and to a degree corrupt; they are run by sinners. I'm not defending the Catholic Church or any church or religion, my point is that if we throw out the good when there are flaws, major flaws, we are left with nothing. Not even ourselves.
Musk is not a figurehead of confidence, and really is nothing like SBF. In fact, he’s regularly very direct about conveying the level of uncertainty when discussing the possibility of RUDs with Starship or being optimistic about deadlines for Tesla when he’s speaking in long form and not taken out of context with a sound bite.
You're rewriting history. We don't need to list all the extreme claims and exaggerations. And his 'short form' communication is not someone else taking him out of context, but how he regularly communicates - on X.
Often being trapped in indecision is a sign of fear and insecurity rather than calm consideration. But sometimes people go into things headstrong and lose because they didn't make any calculations beforehand. There's a balance, or maybe a dialectic, between thinking and non-thinking action.
I think it's a newer culture, pushed by the media and universities, of accepting scientific claims without questioning them. They usually repeat whatever scientists say without evidence. What a scientific consensus or even an unnamed scientist says can be used to dismiss others' points. It's almost like they're the clergy of today. Some are calling this "scientism."
We need to go back to treating all claims with healthy skepticism, reviewing them, and phrasing things carefully to show the degree of certainty or uncertainty involved. We need to protect the integrity of both science and scientific reporting. That's better for us in the long term.
There is also the problem that scientism has a dual meaning.
There is the belief in science as the best means of getting at the truth, and then there is the pejorative use: science as a kind of orthodox religion. So, for the true believer in the pejorative sense, you can't even use the term.
Personally, I think this is a lost cause. Scientism in the pejorative sense is a new religion, and most likely we are at its very early stages. Even when it comes to lifting weights, every YouTube video pretends to be "based on science".
I might be too old to see it happen, but I fully expect that at some point in the future, instead of a movie getting 5 stars, the commercial will say this movie is scientifically proven to be great. A restaurant will claim to have the best tacos based on science.
This is clearly the road we are on, and I don't see a way back. Actual scientists will have to be more and more certain in their language, because that is what pejorative scientism demands of the true believer, and the cult of the true believer in pejorative scientism is growing.
As a scientist, this stuff irks me more than the openly anti-science people, because it deprives science of its whole point: being able to question and understand things yourself.
I dislike how every food item, exercise, etc. is now “scientifically proven to be optimal.” And if you look at the actual research it will show that some biomarker was higher in some comparison… but who is to say having that higher is better, and that they were actually measuring the only relevant factor out of thousands of possibilities to measure? We simply don't know nearly enough about biology to measure one biomarker and conclude that it proves something is "optimal" for the health of every member of an entire species of complicated organisms over their entire lives. If I'm going to lift weights I don't care how it modulated some level of some immunological molecule the authors were studying because it's what they happened to get funded to measure- I just care if when other people tried it if they got stronger, and didn't get hurt... something I can learn from word of mouth rather than peer reviewed literature. And you will get called “anti science” if you question it… as if questioning things that don’t make sense wasn’t the main point of science. /rant
People misinterpret science as being a collection of facts discovered by authority figures rather than a process of questioning the validity of any claims made. It's almost as if the concept of science has turned into the religion of the atheist. Unquestionable, and dictated by more holy figures than them, in the form of scientists, and in some cases politicians, rather than clergy.
I can see how they came to see it this way, since that is how science was taught to me in public school... I only knew better because both of my parents were scientists, and I grew up helping them with their research.
I think that could be changed by changing how science is taught: give people the tools to experiment and reason, and figure out an answer on their own. I'd like to see more kids volunteering to work with real scientists, and see what the work is like firsthand.
These science-as-religion people often attack me online, and in real life when I share my own ideas... and then back down and flip almost to some creepy worship when they figure out I'm an academic PI. No reasoning or evidence can convince them of anything, but credentials instantly do.
I'd like to explain to them- as an officially ordained "science priest" with the degree and job title- that believing things based on authority rather than understanding yourself is the most heinous blasphemy possible. /s
Maybe that's why people are like that, but I think it's more baked into most people than we care to admit. I think part of our human wiring is to have faith. It used to be in deities, now it is in "science!" Humans, most of them anyway, are also wired to follow authority blindly. I really don't think there's any way around it. It's not that people refuse to be skeptical, it's that they are probably incapable of it.
Wholeheartedly agree. This issue comes up in job interviews and forces people into faking confidence in spite of not knowing. Honest people be damned if they give the impression of anything less than absolute confidence.
As a grad student, I had it beaten out of me to be honest about my misgivings about my own results. But now that I'm inside the system, I don't know how we change it.
I get the sense that there are sometimes actual psychopaths among us who do mean it, however, and they often do rise in the ranks; those folks are essentially the other nuke holders in this MAD sort of situation. Among the other things that need structural reform in science, I don't know what it takes to fix this.
> the frequency of hedging words such as “might" and "probably" has fallen by about 40% over the past 2 decades...
Sounds all too right. That certainly seems to be true in the low-end brands of Nature (Nature Food, Nature Water, Nature Joule...) which read like press releases with some random math thrown in.
It's really bad in results associated with "green", "nanotechnology", or batteries.
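The 40% figure quoted above is the kind of claim that's easy to sanity-check on any corpus you can get your hands on. A minimal sketch of the measurement, assuming a hand-picked hedge list (the actual study would use a proper validated lexicon, not this arbitrary set):

```python
import re

# Arbitrary illustrative hedge list -- my own choice, not the paper's lexicon.
HEDGES = {"might", "may", "could", "probably", "possibly", "perhaps",
          "suggest", "suggests", "appears", "seems", "likely"}

def hedge_frequency(text: str) -> float:
    """Return hedging words per 1,000 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in HEDGES)
    return 1000 * hits / len(words)

print(hedge_frequency("The results suggest the effect might be real."))
# 2 hedges out of 8 words -> 250.0 per 1,000
```

Run that over abstracts bucketed by publication year and you'd see whether the trend holds in whatever subfield you read.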
Interesting. It could be that the further "down" the food chain your academic discipline is, the more certain you might need to appear, or think you should appear -- at least this seems intuitively likely, though that is entirely guesswork and there could be many factors at play.
This strikes a chord with me (not that I'm the only one). Every time I write longform, I intuitively want to include, "maybe", "might", "I think", but then axe them from my writing so as to not invite total dismissal.
In the abstract the authors write "Among the existing studies, there is also divergent understandings." It would be interesting to see the methodology of this paper (it doesn't seem to be accessible without paying) and how it differs from previous studies.
If you're writing a scientific article with the results of your research, your audience (i.e., reviewers and other readers, usually scientists) will expect you to have evidence for your claims. Any speculation or other "maybe's" should be kept in the sections like discussion or future work, or left out.
If you're writing something where opinions are accepted or even appreciated, like a position paper or a blog post, the audience should be aware that whatever you say is your interpretation anyway. But then, it also applies that some interpretations or guesses are more "educated" than others.
"Maybe" does sound too vague. Although it sounds like a language trick, I prefer to use a disclaimer at the beginning or "assume", "expect", "my understanding", to show that whatever follows is my opinion, but also that I think it's still valid and I'm willing to stand behind it.
This is not limited to science. This is a consequence of people relying on automated grammar and style-checking tools. For example, Grammarly always highlights the words and phrases indicating uncertainty in blue and suggests they be deleted.
I did bother reading the article, and so with my 13 years of theoretical physics research well behind me, I say:
Good. Fantastic. Wunderbar. Magnifique. "Maybe", "might", and "possibly" are all words that should be banned from scientific papers in all sections except "Future Work", and even there the reviewers should be on their guard for authors trying to sneak them in.
I read way too many papers back when I was a researcher where the authors did not have the proper evidence for claims they made, and they fell back on bullshit weasel words like this. It does NOT show that you're humble or retrospective. It shows that you didn't do your job but you really want your pet theory (or more likely, theme of your next grant's research) to be true, so you're going to act like it is anyway but with a safety hatch if you get called out on it.
The real elephant in the room, that the scientific community still refuses to address except when they think they can work it into a grant, is the reproducibility crisis.
I was confused by the article title...the word "some" didn't make sense. Then I finally got the joke. So apparently, I am so used to article titles being fully confident that I subconsciously felt this one was off.
Obligatory "haven't read the journal article." But I can say that I remove words like that from my manuscripts to reduce the word count. Older articles I read seem to ramble more and have less information density? So that's another possible explanation?
>it has become almost expected that everyone is fully confident all the time
I still remember my first discussion with a CEO in my first job as a programmer (not even out of uni yet). My project was delayed because of a buggy installer and I had problems reproducing the issue.
I was repeatedly asked "how long until the fix?" and I kept saying "I don't know, it's a bug; it can take two hours or it can take a week".
We both left this conversation frustrated. Though of course for her it was business as usual, and for me it's still awkward and stressful when I think back to it.
Anyway this taught me to confidently state something like "this will take three days" instead of trying to convey uncertainty to the higher management.
These are just characteristics of high intelligence. You would always have felt out of place in society.
I think there are personality traits at play here also: simply being introverted means you are more focused on whether your ideas make sense to you than on how they will be received or what they will make other people think about you.
I think there are intelligent extroverts who just aren't that worried about whether their opinions are correct, except in rare cases where the consequences of being wrong would be severe.
When I was younger I had a bad habit of taking what information I did know and then piecing it together to try and fill in what I didn't know, then stating that confidently. Many times I was wrong. In fact, most times I was wrong and not even close. But I was expected to be that way in order for people to listen to me. I was doing this almost until I was twenty because of that expectation.
The one good thing that came out of this maladaptive behaviour is that I learned to take my hunch, piece together the information, and then use that to go looking for what I didn't know. In a way it got me addicted to constantly learning, something I never managed in school.
That is very much a normal human condition honestly. In my experience what I’m seeing from time to time is younger folks have way more confidence in things they’re wrong about than I did when I was their age. I had some confidence no doubt but ample room for error where I would run a “trust but verify” pattern on my own bullshit regularly. I learned from who knows what that it’s bon ton to say “I don’t know” so I got into the habit of withholding my theories in favour of yielding to the other parties. Of course it helps in tech and business to be discerning. But I’m seeing less of this and way more of baseless, unfounded conclusions that have not stemmed from deductive reasoning, conflicting research and so on but rather just-so statements and single-dimension emotional reflexes. It’s clear the presenter wholly lacks experience, but we play along because it’s amusing.
The hard part is articulating to someone to slow down because we run the risk of them being offended. But that’s a different thread altogether.
Your account is dead but none of the posts in your history look bad. You can email hn@ycombinator.com and if there has been a mistake they'll sort it out.
Thanks I’m not sure what happened but it seems to be fixed…
Here's an idiot's guide to sounding smart:
- use big words
- speak with utmost confidence
- if you can't convince, confuse
Part of it is that the English language doesn't have tools to express uncertainty without adding verbosity. For example, take what I said here: my sentence length increased by 25% by saying "part of it is" to communicate uncertainty. This gets tiring quickly for both the writer and reader. It becomes boilerplate.
That's a good point. As a non-native speaker, I use Grammarly to improve my writing. But often, when I add clauses that indicate uncertainty, Grammarly complains that I could be more concise by dropping those parts.
> the English language doesn't have tools to express uncertainty without adding verbosity
Well, there's always irony quotes. For example:
The new Apple lineup will "amaze".
[dead]
Fair, we at least have the subjunctive, we just don't use it very often. If I were to start a sentence like this, I would be using the subjunctive.
(Which I believe has been confirmed by Gemini):
> Yes, the sentence "If I were to start a sentence like this, I would be using the subjunctive" uses the subjunctive mood correctly.
> The subjunctive mood is used to express hypothetical or unreal situations, wishes, suggestions, or demands. In this sentence, "If I were to start a sentence like this" presents a hypothetical situation, and "were" is the correct form of the verb "to be" in the subjunctive mood.
But yeah, it doesn't get at the uncertainty expressed in other languages. Is there another mood for that or how do other languages handle it?
As one teacher and several profs told me, "You don't need to say, 'I think', because the fact that you are saying or writing it means that you think it". In other words, if you are uncertain, don't express the idea, because no amount of verbal hedging will protect you if you express something that is incorrect.
However, the best way to get the correct idea if you are wrong and unsure is to confidently propose your idea as if it is correct - Cunningham's law applied to everything.
Writing 'I think X' or 'there's a chance that X' instead of 'X is' or 'I know that X'.
> if you are uncertain, don't express the idea, because no amount of verbal hedging will protect you if you express something that is incorrect.
The problem with this is the target audience of your writing: whether you're assuming an adversarial audience or a cooperative one. The bigger and less self-filtered the audience, the more likely it is to be the first kind.
> the best way to get the correct idea if you are wrong and unsure is to confidently propose your idea as if it is correct - Cunningham's law applied to everything.
Personally, when I see someone being consistently confidently incorrect, I just filter out them and their opinions instead of correcting them, because a lot of the time it's a lost cause.
For example, on Reddit and Twitter the sheer number of people who are confidently incorrect made conversations not worth having, so I just stopped many years ago.
Everyone has their threshold for how many times you are allowed to be incorrect within a given context, sure.
> adversarial audience...
Yes. Even if the audience you target is small and on your side today, a successful piece of writing will gain an adversarial audience, eventually.
> If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him. - Cardinal Richelieu (apocryphal paraphrase from hearsay apparently but still a great quote).
And yet one of the best ways to lose respect in a technical context, especially as a junior, is to be confidently incorrect. Showing humility helps a hell of a lot for all concerned if you've in fact misunderstood something.
> if you are uncertain, don't express the idea
A better interpretation is to express the uncertainty at its core, and not wrapped into your subjectiveness.
For instance if you read it in a publication and kinda agree but don't fully trust it, instead of "I think X" you could go with "Publication Y exposes the possibility of X"
The real point of the advice is to get rid of fluff, not remove nuance or valuable information.
An explanation some teachers beat into me over the years is that everyone already knows you're only stating a thing because it's your opinion. It's discourteous to waste time repeating something so basic, so such hedging language should only be used when it actually adds something to the conversation -- warning the reader of something with an unexpected level of certainty.
On some level, that makes even more sense in scientific writing. You already have method sections, experimental results, .... If you haven't been overconfident in those sections then the reader really ought to have all the tools they need to appropriately examine the paper.
We lose nuance by using only maximalist statements. Take these rather typical intro phrases:
We find [strongest result]
Our findings strongly indicate [likely result]
Our findings suggest/indicate [result that needs more research]
The way I and many other people read papers is that we read the methods and appendix last; good writing can give you hints at those things without needlessly bloating the paper. The nuance also makes me trust the authors more, as it indicates humility and diligence.
Huh? People sometimes state things that are the genuine facts, as far as anyone has been able to prove, but that can't entirely be 100% verified.
This isn’t some obscure niche case either, it happens every day on HN.
If we're interpreting what I wrote charitably, it's implicit that "obvious" facts will extremely rarely have hedging language, so the anti-hedging conversation can focus elsewhere.
I _knew_ somebody was going to nitpick that phrase and rejected the urge to write a few extra paragraphs in the format of a HN comment expressing the idea more completely...
If we want to look at it a bit less charitably though, even things that are true "as far as anyone has been able to prove" are usually not perceived to be true for a significant fraction of the human race, so in interactions with randos (like this current thread) it's still reasonable to assume that we're talking about opinions rather than facts. Even when talking about "facts" (as you partially defined them), the core idea still holds -- your _opinion_ (maybe a very well validated opinion) is that those facts are true, and hedging language is discourteous to the reader.
It also happens from the opposite direction.
Seemingly random claims that no one has been able to prove so far, but that the person has some special reason to strongly believe are true.
E.g. ‘Back channel’ communication in diplomacy.
Also e.g. MS Word suggests getting rid of qualifiers.
With papers and especially grant applications, it's ridiculous that everybody knows they are overpromising, but they still don't go through if they don't overpromise.
Before I send something I make sure each paragraph has at least one bit of green underline. The day that Word and I agree on grammar is the day I switch back to pencil and paper!
Including HN comments, I see...
> Even as a kid parents and teachers will suggest to kids to edit words expressing uncertainty out of their writing. They are coached to talk like that when public speaking, at job interviews, and even dating.
That's the first time I'm hearing this, and it's certainly not something I've ever observed. Stated this broadly, I cannot believe that this statement is true.
I'm not sure it's presented as removing expressions of uncertainty.
But modern writing instruction tends to prioritize precise verbs and nouns (choose the most precise word) and takes a very critical eye toward adverbs, adjectives, or prepositional phrases that might simply contribute to cadence, affect, and voice. I find it really hostile to style and readability, but it's easy to evaluate in education because you can just call out the flavoring bits as errors and point to a more precise dictionary word for what's left. Plus, it projects objectivity simply by being "the most right way" to write, leaving the dirty mess of subjectivity, character, and humanity out of the classroom.
But the relevant end result of all that is that there's very little room for qualifiers or tentativeness, as those are usually expressed exactly in those subtle modifiers that people are trained to exclude.
And then this does end up coinciding with a truth of rhetoric, which is that -- much of the time -- the receiver of speech or writing will implicitly accept whatever authority and surety you project. If you sound sure, you must be sure, and if you're so sure, you must have reason to be, so you're probably right.
Put these together, and you get dull, dense technical writing with an authoritative, objective tone, and people are too overwhelmed and passive to bother parsing it enough to challenge it.
Are you american? As a non-american living in the US, the statement from parent post sounds about right to me. Culturally, people sound very confident, even when they have little to no confidence on the content.
This American agrees. I intentionally make my public writings much more confident than my private writings. If I don't, people just ignore me.
I remember a teacher telling me to delete every occurrence of "I think," because it's redundant to tell people you think what you're writing.
That's good advice in that context, and it speaks to the difference in standards of evidence between normal expository or persuasive writing in school and scientific writing. In a scientific paper, if you assert something it's important to flag to the reader what kind of claim it is, e.g. something demonstrated by other research, something that directly follows from your evidence, or something that could possibly be true given prior knowledge and your evidence.
It was definitely true in my high school English classes ~20 years ago that we would get red ink on our papers if we used what were termed "weasel words" like "suggests" and "believe". I have no idea how representative that was, or which direction the trend has been since then. We were also taught to use short, declarative sentences wherever possible, Hemingway being the ultimate example of the "beauty" of economy in language.
It is my understanding that this is a very USA-centric trend in writing, probably being exported along with the rest of the complex basket of American cultural quirks that come along with US-led internationalization (hegemony if you like) since WWII.
I’m guessing you are not in the Bay Area in the US like much of the people on HN?
I suspect HN is much more diverse than you make it out to be. I see _lots_ of non-US people around.
Not any more, in fact.
I was going to write the same thing as the GP, fwiw.
Look at how many public figures embrace the extreme overconfidence as a mark of tactical genius: SBF, Musk, lots of other SV figures, Trump, etc.
Humbleness, humility used to be admired. Can you imagine someone saying that now?
> Humbleness, humility used to be admired.
They are generally praised in the abstract, but in concrete terms of actual people in terms of attracting a contemporary following (and not distant, historical, and academic respect), boldness has usually trumped humility. The idea that there was a time when people didn't flock to the overconfident people presenting strong-leader images over people with genuine humility is just one of the many ways people create a mythical past to paint the manifestation of long-present aspects of human culture as novel degeneracy into which the world has recently fallen.
> in concrete terms of actual people in terms of attracting a contemporary following (and not distant, historical, and academic respect), boldness has usually trumped humility
I don't think that's true: Confucius, Jesus, George Washington, Lincoln, MLK ... Eisenhower, every president before Trump, the New England culture of looking down on ostentation and displays of wealth, .... I read a New Yorker article several years ago about a culture of Wall Street leaders in the 1980s who purposely wore cheap watches, had homes with low fences, etc.
> I don't think that's true: Confucius, Jesus,
Jesus had a relatively small band of followers that abandoned him at the first sign of unpopularity with authorities, pissed off more people than followed him, and was murdered by the public authorities at the demand of the local population. He got a bigger posthumous following after (actually or in a myth created by people seeking their own influence) rising from the dead.
Says a lot more about the respect for miracles than respect for humility, however much one might read what is written about him as calling for humility from his followers. Meanwhile the list of historical figures whose rise to influence was fueled by the exact opposite of humility is...not short.
> every president before Trump,
Very many of the Presidents before Trump were...not known for humility in their time. Some have had that added to their popular myth afterwards, but, I mean, plenty not even that. And the most recent one who was known for it (I'm...not young) is probably Carter, the only one in my lifetime for whom you might make a case that it was something he was particularly known for.
I don't want to get into religious debate, but you might want to re-read the Gospels - you are missing the fundamental message of Jesus, which has turned out to be extremely popular - arguably the most popular text and message in the world.
> Very many of the Presidents before Trump were...not known for humility in their time.
Who? What did they do? Nobody like Trump or other people I named.
> I don’t want to get into religious debate, but you might want to re-read the Gospels - you are missing the fundamental message of Jesus, which has turned out to be extremely popular - arguably the most popular text and message in the world.
You might want to reread the upthread context: I was discussing the role of humility “in concrete terms of actual people in terms of attracting a contemporary following” and distinguishing that from backward-looking regard (since we are comparing to how Trump is currently viewed, where we can see only the contemporary regard and not what future generations will think of him in retrospect.)
What has happened after the time of Jesus ministry with respect to the impact of works written about him is irrelevant.
Jesus's message of humility was clear. The rest is your speculation, 2000 years later, about what worked.
If it's so clear and persuasive then why is it more lacking today than 2000 years ago?
What was the attitude toward humility 2000 years ago, and where? I personally have no idea.
Jesus is certainly far, far more influential than 2000 years ago. Christianity was an obscure religion and probably nobody had heard of Jesus outside the immediate region.
Jesus' very clear message of humility is published in a massive best-selling book that got gold filigree letters in the medieval manuscripts, and which was promoted in buildings that were, for a thousand or so years, the largest and grandest that most people ever saw. It led to the creation of the famously wealthy Catholic church (whose stuff was famously confiscated by Henry VIII when he needed more gold), led by popes whose desire for expensive art led to the sale of indulgences, which led to Martin Luther and a massive Europe-wide series of conflicts. It also led to the less famous but more infamous Knights Templar, who didn't survive having their money taken.
Parable of the sower comes to mind: hear the word, but did not follow it.
Except the popularity in that era was due to all the powerful people saying it proved their right to absolute power and sometimes made other religions (including denominations of Christianity) illegal, and popularity waned rapidly as it stopped being required.
Actually reading the bible is a big part of why I switched from Catholic to Wicca as a teen before deciding it was probably all ahistorical anyway.
(I'm not trying to sell Jesus or Christianity; I'm just talking factually about the text of the Gospels. For those unfamiliar - the Gospels are the first four books of the New Testament, in which the authors describe Jesus' actual words and life, often quoting Jesus. The rest of the New Testament is other people's responses to Jesus and Christianity.)
As I understand it: Humans are complicated, as Jesus depicted them. They are both terminally weak and flawed, and there is also good and justice and mercy in them - the many angels of our natures. Jesus ministered to the weak and flawed, not the good and just, certainly not to the perfect. It's a message of love to people who sin, which is pretty much everyone; Jesus didn't expect differently. His message was to love the sinners and for the sinners to accept his love.
All human institutions are flawed and to a degree corrupt; they are run by sinners. I'm not defending the Catholic Church or any church or religion, my point is that if we throw out the good when there are flaws, major flaws, we are left with nothing. Not even ourselves.
[flagged]
What does that say about humility?
Musk is not a figurehead of confidence, and really is nothing like SBF. In fact, he’s regularly very direct about conveying the level of uncertainty when discussing the possibility of RUDs with Starship or being optimistic about deadlines for Tesla when he’s speaking in long form and not taken out of context with a sound bite.
You're rewriting history. We don't need to list all the extreme claims and exaggerations. And his 'short form' communication is not someone else taking him out of context, but how he regularly communicates - on X.
Yes and no.
Often being trapped in indecision is a sign of fear and insecurity rather than calm consideration. But sometimes people go into things headstrong and lose because they didn't make any calculations beforehand. There's a balance, or maybe a dialectic, between thinking and non-thinking action.
I think it’s a newer culture pushed by the media and universities to accept scientific claims without questioning them. They usually cite whatever they say without evidence. What a scientific consensus or even unnamed scientist say can be used to dismiss others’ points. It’s almost like they’re the clergy of today. Some are calling this “scientism.”
We need to go back to treating any claims with healthy skepticism, reviewing them, and phrasing things carefully to show the degree of certainty or uncertainty involved. We need to protect the integrity of both science and scientific reporting. That's better for us in the long term.
There is a problem, too, in that scientism has a dual meaning.
There is the belief in science as the best means of getting at the truth, and then there is the pejorative sense: using science as a type of orthodox religion. So around a true believer, you can't even use the term in its pejorative sense.
Personally, I think this is a lost cause. Scientism in the pejorative sense is a new religion and most likely we are at the very early stages. Even when it comes to lifting weights, every youtube video pretends to be "based on science".
I might be too old to see it happen but I fully expect at some point in the future instead of a movie getting 5 stars, the commercial will say this is scientifically proven to be great movie. A restaurant will claim to have the best tacos based on science.
This is clearly the road we are on and I don't see a way back. Actual scientists will have to be more and more certain in their language, because that is what scientism in the pejorative sense is to the true believer, and the cult of the true believer in pejorative scientism is growing.
"SCIENCE IS REAL"
As a scientist this stuff irks me more than the openly anti-science people- because it deprives science of the whole point- to be able to question and understand things yourself.
I dislike how every food item, exercise, etc. is now “scientifically proven to be optimal.” And if you look at the actual research it will show that some biomarker was higher in some comparison... but who is to say having that higher is better, or that they were actually measuring the only relevant factor out of thousands of possibilities? We simply don't know nearly enough about biology to measure one biomarker and conclude that it proves something is "optimal" for the health of every member of an entire species of complicated organisms over their entire lives. If I'm going to lift weights, I don't care how it modulated some level of some immunological molecule the authors were studying because that's what they happened to get funded to measure- I just care whether, when other people tried it, they got stronger and didn't get hurt... something I can learn from word of mouth rather than the peer-reviewed literature. And you will get called "anti-science" if you question it... as if questioning things that don't make sense weren't the main point of science. /rant
People misinterpret science as being a collection of facts discovered by authority figures rather than a process of questioning the validity of any claims made. It's almost as if the concept of science has turned into the religion of the atheist. Unquestionable, and dictated by more holy figures than them, in the form of scientists, and in some cases politicians, rather than clergy.
I can see how they came to see it this way, since that is how science was taught to me in public school... I only knew better because both of my parents were scientists, and I grew up helping them with their research.
I think that could be changed by changing how science is taught: give people the tools to experiment and reason, and figure out an answer on their own. I'd like to see more kids volunteering to work with real scientists, and see what the work is like firsthand.
These science-as-religion people often attack me online, and in real life when I share my own ideas... and then back down and flip almost to some creepy worship when they figure out I'm an academic PI. No reasoning or evidence can convince them of anything, but credentials instantly do.
I'd like to explain to them- as an officially ordained "science priest" with the degree and job title- that believing things based on authority rather than understanding yourself is the most heinous blasphemy possible. /s
Maybe that's why people are like that, but I think it's more baked into most people than we care to admit. I think part of our human wiring is to have faith. It used to be in deities, now it is in "science!" Humans, most of them anyway, are also wired to follow authority blindly. I really don't think there's any way around it. It's not that people refuse to be skeptical, it's that they are probably incapable of it.
Agree. At least in my western country, it has become almost expected that everyone is fully confident all the time.
I’m someone who likes to deliberate on thoughts, take my time, listen to opinions and assess them, then provide my opinion (which may also change over time with more information or changing circumstances). I am humble, a listener, and don’t believe I know everything. I feel a little out of place in current society.
Wholeheartedly agree. This issue comes up in job interviews and forces people into faking confidence in spite of not knowing. Honest people be damned if they give the impression of anything less than absolute confidence.
Relevant (and enjoyable): https://youtu.be/OV5J6BfToSw
[particularly 13:44~]
As a grad student it was beaten out of me to be honest about my misgivings about my own results. But now that I'm in the system I don't know how we change this system.
I get the sense, however, that there are actual psychopaths amongst us who do mean it, and they often do rise in the ranks; those folks are essentially the other nuke holders in this MAD sort of situation. Among the other things that need structural reform in science, I don't know what it takes to fix this.
> the frequency of hedging words such as “might" and "probably" has fallen by about 40% over the past 2 decades...
Sounds all too right. That certainly seems to be true in the low-end brands of Nature (Nature Food, Nature Water, Nature Joule...) which read like press releases with some random math thrown in.
It's really bad in results associated with "green", "nanotechnology", or batteries.
Interesting. It could be that the further "down" the food chain your academic discipline is, the more certain you might need to appear, or think you should appear -- at least this seems intuitively likely, though that is entirely guesswork and there could be many factors at play.
This strikes a chord with me (not that I'm the only one). Every time I write longform, I intuitively want to include, "maybe", "might", "I think", but then axe them from my writing so as to not invite total dismissal.
In the abstract the authors write, "Among the existing studies, there is also divergent understandings." It'd be interesting to see the methodology of this paper (it doesn't seem to be accessible without paying) and how it differs from previous studies.
Depends on what you're writing, I guess.
If you're writing a scientific article with the results of your research, your audience (i.e., reviewers and other readers, usually scientists) will expect you to have evidence for your claims. Any speculation or other "maybe's" should be kept in the sections like discussion or future work, or left out.
If you're writing something where opinions are accepted or even appreciated, like a position paper or a blog post, the audience should be aware that whatever you say is your interpretation anyway. But then, it also applies that some interpretations or guesses are more "educated" than others.
"Maybe" does sound too vague. Although it sounds like a language trick, I prefer to use a disclaimer at the beginning or "assume", "expect", "my understanding", to show that whatever follows is my opinion, but also that I think it's still valid and I'm willing to stand behind it.
This is not limited to science. This is a consequence of people relying on automated grammar and style-checking tools. For example, Grammarly always highlights the words and phrases indicating uncertainty in blue and suggests they be deleted.
I did bother reading the article, and so with my 13 years of theoretical physics research well behind me, I say:
Good. Fantastic. Wunderbar. Magnifique. "Maybe", "might", and "possibly" are all words that should be banned from scientific papers in all sections except "Future Work", and even there the reviewers should be on their guard for authors trying to sneak them in.
I read way too many papers back when I was a researcher where the authors did not have the proper evidence for claims they made, and they fell back on bullshit weasel words like this. It does NOT show that you're humble or retrospective. It shows that you didn't do your job but you really want your pet theory (or more likely, theme of your next grant's research) to be true, so you're going to act like it is anyway but with a safety hatch if you get called out on it.
The real elephant in the room, that the scientific community still refuses to address except when they think they can work it into a grant, is the reproducibility crisis.
I was confused by the article title...the word "some" didn't make sense. Then I finally got the joke. So apparently, I am so used to article titles being fully confident that I subconsciously felt this one was off.
Interesting that this coincides with the lost feature of LLMs that highlighted uncertainty by color.
Obligatory "haven't read the journal article." But I can say that I remove words like that from my manuscripts to reduce the word count. Older articles I read seem to ramble more and have less information density? So that's another possible explanation?
That's a good point. Perhaps journal article reviewers fail to realize the importance of those words in the grand scheme.