jmward01 6 hours ago

This is not the article's topic, but the title immediately made me think of a cool data technique where you trace noise backwards to determine causality. If a -> b then the noise in a should be in b.
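A minimal numpy sketch of that intuition (my own toy setup, not taken from any particular paper): if a causes b, then a's intrinsic noise leaks downstream into b, but b's own noise never leaks back into a.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

noise_a = rng.normal(size=n)   # noise injected at the source
noise_b = rng.normal(size=n)   # noise added downstream

a = 1.0 + noise_a              # a is driven by its own noise
b = 2.0 * a + noise_b          # a -> b, so b inherits noise_a

# a's noise is visible in b: corr should be large, about 2/sqrt(5)
leak_forward = np.corrcoef(noise_a, b)[0, 1]
# ...but b's noise is invisible in a: corr should be near zero
leak_backward = np.corrcoef(noise_b, a)[0, 1]

print(f"corr(noise_a, b) = {leak_forward:.2f}")
print(f"corr(noise_b, a) = {leak_backward:.2f}")
```

In practice you don't observe the noise terms directly, so real methods infer the direction from asymmetries in fitted residuals, but the asymmetry above is the core of it.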

  • Tachyooon 4 hours ago

    Sounds like a fun topic to dig into. Do you have any papers or books you'd recommend?

    • jmward01 4 hours ago

      I have no idea who/when/where the 'original' idea came from. I stumbled on it when I was thinking about history and trying to tell how languages or artifacts influenced one another. I remember reading (don't know where) about linguists creating timelines based on when features appeared and capturing the version of that feature at the time it was introduced compared to how it later evolved. Again, a long time ago so no real definitive answers there. A quick search brought this paper up.[1] I just skimmed it and it looks like it has the core idea in it but no promises.

      [1] Causal Inference from Noise https://onlinelibrary.wiley.com/doi/pdf/10.1111/nous.12300

      • Tachyooon an hour ago

        Thanks! I'll go give it a look.

silexia 6 hours ago

Prices are nearly always more accurate than pundits... This is why free markets have worked so well. We need to cut government intervention and interference and allow them to work properly again.

  • kruffalon 4 hours ago

    Again?

    When and where did there exist free markets?

PaulRobinson 6 hours ago

Not read the whole thing as there's a paywall involved, but I have a broad take on what I have read.

I say this as somebody with a hobby obsession with trading on sports betting exchanges, which I've been doing on and off for 20+ years.

From high school onwards, most of us were taught a great deal about calculus and not a great deal about probability. That's because for many decades working out ballistics was a more useful skill to teach young engineers than understanding how to interpret the statistics of a pandemic, for example.

The rising interest in probabilities in recent years has sat at a weird intersection: real-world events that surprise us by being "unlikely"; people questioning the validity of scientific trials using illogical arguments on social media; the legalisation of sports betting markets in the US; and the prevalence of probabilistic and stochastic methods in modern technologies from RL to LLMs.

But, here's the thing: most people are awful at it. And most people are going into prediction markets (and sports betting markets), thinking they know something others don't, with all the logical and calculated thought of an anti-vaxxer who does not understand terms like "sensitivity" and "specificity".

Signal is not noise. Noise is not signal. Yes, the guy on CNN is wrong, just as wrong as the guy on Fox News, but it doesn't mean expertise is dead and gut instinct by amateurs is winning by showing superiority of the Wisdom of the Crowds.

Look, for example, at the last US Presidential election. The markets disagreed with the polls by a long way, and everyone assumed the players were either corrupt (moving the line helps move the conversation in the media) or idiots.

Turned out, it was a guy with a smart idea to figure out something experts refused to figure out: shy Trump voters. He commissioned polls that, rather than asking people who they would vote for, asked them who they thought most of their neighbours would vote for. Turns out, that's a far more accurate technique. He did some maths, pulled up some spreadsheets or notebooks, threw some Bayesian analysis at it, and realised the main polls and prediction markets were out, so he threw some money at it. And then his government (the French) said he couldn't have the money, but that's another story.

The point I think I want to make is that this is an interesting and fascinating area to dive into, but almost everything I've read about it online is shallow, nonsensical, illogical and often wrong. From the intro I'm not sure this is any different. YMMV. But yeah, dive in, it's fun playing with this probability stuff in real-world scenarios.

  • michael_j_x 4 hours ago

Which guy is this? Sounds like an interesting read

CGMthrowaway 6 hours ago

The old distinction between “signal” (valuable content) and “noise” has collapsed. Today the noise is the product, because noise keeps the algorithm running - it helps a platform compute our own individual feed-bubbles.

Platforms don’t actually curate content, they curate engagement rate. Therefore the optimal strategy for a platform is to produce as much noise as possible and sort it in real time.

We experience this as “I can’t find the good stuff anymore” while the platform continues to profit as long as we keep looking for the good stuff.

  • atoav 5 hours ago

    I don't think it has collapsed.

If you read early works that expanded the topic from its original telegraphy/telephony context (e.g. Cybernetics by Norbert Wiener), it is pretty clear that signal and noise were always subjective. Or let's phrase it differently: one person's noise could be another person's signal. Whether something is more signal-like or more noise-like depends entirely on who is looking, and for which purpose.

As for signal and noise in social media: let's say you follow 5 people. If we realistically assume you are not going to like everything those 5 people post, that means your feed will contain significant amounts of noise even without any algorithmic curation or non-follower content. Part of that is that you won't even know yourself what you're looking for at all times.