23 votes

New computer breakthrough: Light-speed unlocked

18 comments

  1. [13]
    OBLIVIATER

    Does this have any practical application outside of running AI workloads? Should I have any reason to expect to be using a GPU powered by this technology in the next 10 years? In my lifetime?

    If this hype is to be believed doesn't this fundamentally change the landscape of processing? I don't know enough about these types of chips but it's hard not to be skeptical of the claims made in this video. The segment with the CEO of the company who produced these things makes me feel like this is more of an Ad for some AI venture capitalist bait vs any actual groundbreaking invention but that's just my cynicism from watching so many "breakthrough invention" announcement videos before that turn out to be little more than footnotes.

    Edit: The creator of this video sells a course on how to smartly invest in AI startups... This expands my skepticism on the sky high promises made in this video on how this tech will "change the world"

    "Invest With Confidence:
    Identify high-potential opportunities within the semiconductor and AI sectors. Discover expert frameworks to evaluate technology effectively, identifying technical risks, estimating costs, and more."

    13 votes
    1. Amarok

      Optical systems change compute like electricity changed labor, more than anything since Turing. That's why a couple hundred startups have been chasing this technology for about thirty years now. Most of them went the vaporware route because their various approaches dead-ended before producing anything that could manage precision computing with light at scale, which is what this company has demonstrated in real hardware at client sites (not trade shows) for the first time. I predict no shortage of three letter agencies lining up for these and buying them out long before they land in commercial data centers, and you can bet your ass these chips will have the toughest export controls ever drafted.

      This particular product is for AI workloads, which are the most expensive, the most energy intensive, and rather simple to optimize for in hardware. That's why they targeted that market. GPU workloads are pretty similar, and in fact AI complements them, which is why Nvidia's newer cards are using AI to render 3 out of every 4 frames (4x-ing their framerates in the process while running cooler). This technology can apply there, but not with this product. They'd have to make a new card, and they haven't got the resources to chase multiple market segments at present. If this tech delivers, they'll probably be able to buy Nvidia in a few years.

      As I understand it, they are ready to begin large scale production (as in, let's build the factory, not the product, that part's already proven). That means they are looking for funding, which is why they took the time to sit down with a popular youtuber who is both a processor designer and into tech stocks. This is absolutely a marketing ploy to attract investor attention on the part of Lightmatter. So, they have a small factory/production capacity, sold this to a handful of select clients (for the low low price of one of each employee's kidneys I'm sure) but cannot make them fast enough to meet the kind of demand that Nvidia can, for example... nor can they produce enough for a regular home consumer market. Yet.

      They also had to cut a number of corners making this first-production-run chip, because there are many prickly issues integrating electronics with photonics, and this is as far as their R&D has gone solving those problems (using cheaper/faster solutions, not great solutions). That mostly comes from the fact that electrons can be stopped or stored and photons cannot be stopped or (at least for now) stored. Nightmares happen when you put those two limitations up against each other. So, this humble 114 TOPS chip with its 64 TB/s interconnect is the '1X CD-ROM' version of their first photonic product, and it outclasses everything else on the market by ridiculous margins, kinda like that first CD-ROM obsoleted the floppy disk.

      They (3 MIT graduates) are hoping an investor sees this and decides to fund their factory. I wouldn't buy the stock yet (because I listen to Felix), but if I were looking to place a bet on photonics this is the best one yet and I'm rather excited to see some real progress take place.

      10 votes
    2. [6]
      creesch

      The creator of this video sells a course on how to smartly invest in AI startups... This expands my skepticism on the sky high promises made in this video on how this tech will "change the world"

      That, and the presentation style is more that of a promotional press release or sponsored video. I also had a look at their latest videos and am seeing a clear pattern that hints at them having somewhat of a tendency to slightly overhype things. It is subtle, but it is there when you look closely:

      • New Computer Breakthrough: Light-Speed Unlocked
      • Microchip Breakthrough: World’s First Silicon-Free Processor
      • HUGE Microchip Breakthrough: The Secret Plan of NVIDIA
      • The Next Big Thing in Semiconductors is Finally Here!
      • The Truth about Microsoft Majorana Chip 😬
      • New Chinese GPUs and the Truth about DeepSeek. NVIDIA is out?
      • Computing Breakthrough: New Light-Based Computer Takes Over!
      • This New Computer Chip is Defying the Laws of Physics
      • The Discovery of a New Semiconductor Can Change Everything
      • Next-Gen Computers. The Future Has Never Been Cooler!
      • AI Meets Quantum: Google Breakthrough Explained
      • My Course on Technology and Investing
      • The Beauty of Uncertainty: New Probabilistic Computer Explained
      • The Secret Plan of IBM: New Microchips Explained
      • Deepmind New AI Breakthrough: This is The Future
      • TSMC and Intel Microchip Breakthrough: The Future is Glass

      For those confused, no it isn't subtle at all. The majority of their videos include something about alleged "breakthroughs".

      8 votes
      1. [5]
        Amarok

        For what it's worth, what you've identified here is a youtube problem, not specific to this channel. Using clickbait titles makes such a massive difference in the views and the like velocity of any video that every creator feels like they have no choice but to do it. I've even seen creators use crazy video titles that have little to do with the subject the video covers, and they apologize for it in the first minute of the video, explain this is why they do it, and then tell you what their content/channel is really about. They expect that once you've stumbled on to their channel, you'll sub based on their content coverage and not sweat the titles, and that seems to work. This behavior is common even in excellent channels that cover hard science.

        It's just 'how youtube works' now. I hate it, most of the people doing it also hate it - but if they don't do it, they can drop a couple of decimal places from their video views. It's the old reddit 'catchy title' problem. I'd have edited this title to something more reasonable, but Tildes etiquette means I have to use the author's original title even if it's clearly clickbait.

        That's not to defend or attack any channel - it's just to say that clickbait titles are not a reasonable tell that means that a channel on youtube is some sort of hustle by default.

        3 votes
        1. [4]
          creesch

          For what it's worth, what you've identified here is a youtube problem, not specific to this channel.

          Hard disagree. While there are indeed many channels with titles like this, most of them are light on information and quality. There are plenty of channels managing to do just fine without being over the top like this for every single video. In my experience, the latter type of channel also has better content in regard to sourcing claims, neutral reporting, etc.

          In combination with the rest of the video, it doesn't inspire much confidence in the content creator being much more than someone who puts press releases in video format without much input otherwise. They might be popular, but that is irrelevant as far as I am concerned, because popularity does not equal quality of content. In fact, most channels that prioritize pure growth over everything else show it in the quality of their content.

          They expect that once you've stumbled on to their channel, you'll sub based on their content coverage and not sweat the titles, and that seems to work.

          As I said, such titles are a red flag already and generally enough for me to steer away from a channel. I made an exception given it was posted here, and it only validated my choice to normally avoid videos with titles like this. So, as far as I am concerned, they have not done well on either count.

          3 votes
          1. [3]
            Amarok

            I've been enjoying Anastasi for about a year now and I find her to be tip-top at keeping up with the latest developments in processors. If you've found anyone better at covering bleeding edge processor technology I'd love to know. That's enough for me and I don't sweat the format at all. I just want to keep up with the tech. I'd have to unsub from 3/4 of my total channels if I wanted to be a purist and avoid videos that include marketing. Youtube is a business, and business rules apply. When a creator hires a team to do editing or marketing (as most successful channels do), they will hard sell you on using this clickbait format and some will even refuse you as a client if you won't let them manage it like this.

            If you're a creator with millions of subs, congratulations, you've made it, and the view numbers from your subscribers alone are enough to bump you in the algorithm. That's when you can move to a more serious format if you choose - but anyone under a million subscribers can't really do it that way if they want to grow.

            4 votes
            1. [2]
              creesch

              Honest question: are you actually keeping up with the latest developments, though? Or is there a possibility that you might just be getting swept up in a cycle of potential and (over)hyped things? Because that is very much the impression I am getting from this specific channel.

              To be clear, I get it. It is really cool to see promising technology as soon as possible and speculate about its potential impact. But a lot of these things never actually reach the market or have the impact companies claim they will. I don't feel the need to seek out these grandstanding proclamations from tech companies, copied almost verbatim by YouTube channels, either.
              In my opinion, it doesn't make you more informed about technology like this. The actual impact and potential will only become visible once products like these make it into hands other than the marketing department's - specifically, third parties who can validate the claims. That is the first point at which anyone can start making actual claims about products like this, as far as I am concerned.

              Which means I also think that a channel where around 90% of the video titles claim to be about something revolutionary is just selling hot air and peddling hype with no proven substance.

              If you've found anyone better at covering bleeding edge processor technology I'd love to know.

              As I said, I think what I consider actual bleeding edge technology is slightly different from what you view it to be. But I also don't follow a single source of information, certainly not YouTube channels chasing views over content, as I said in my previous comment. I do follow a variety of tech outlets, some of them on YouTube, but most of my information comes from old-fashioned written text: news articles, blog posts, etc.

              And again, I don't feel the need to be as on top of things as I possibly can be. I have done that in the past and, for me, it didn't give me an actually better understanding of the technology. Neither did I feel like it gave me an edge - rather the opposite, as I already said. There is so much technology announced that seems poised to change the world or a specific industry, and a majority of it never does. Or when it does, it is so many years later that any early announcement you might have seen is no longer relevant in the slightest.

              2 votes
              1. Amarok

                It is really cool to see promising technology as soon as possible and speculate about its potential impact.

                That's exactly it. I'm not interested in learning about deep processor mechanics so I can design chips. I'm also not interested in silicon - that's a dying paradigm, small refinements are all that's left. Pure facts. Photonics will replace most silicon in the future just because it's faster and cheaper... once we figure out how. The value to me here - and I think you're overlooking this a bit - is that this video does provide a solid layman-friendly technical explanation of the optical technology. Find me any other place on the entire internet to see this sort of photonics coverage, other than company websites.

                She covers every company making tech like this, there's no bias except in favor of the tech itself - and I share that bias thanks to reading Popular Mechanics in the 80s, just like my fascination with nuclear power. I share that stuff here too. As her reach improves, she can start doing more interviews, be taken more seriously, get more live interviews, factory tours, etc. She's got a seven year old channel and 250k subscribers which is not bad considering how ridiculously esoteric the content is - and she has no competition for covering this niche on youtube or any other site, far as I can see. The only deep photonics on youtube is her, all the rest are just advertisements by companies and the occasional videos of some engineers having a good time in a lab. It's a 'take what I can get' situation. I could read scientific journals but that takes longer, requires massive domain knowledge, and costs money.

                I'm hoping that she can ditch the existing format this year and level up into something more serious, and she's taking advice in the comments all the time. The first 200k subscribers is the hardest, a million usually comes fast after that. This is not (at least to me) some silly photonics-hype scam channel. This is a chip designer who shares her love of computing and does her best (with the added challenge of English being a second language) to explain it all.

                Anyway, I'm done with this topic. This is a good illustration of why I don't make many top level submissions to Tildes. If I want to have to defend every single thing I post, I can do it on hackernews or reddit like everyone else. More likely I'll just stop posting anywhere, it's rarely worth the time anymore.

                2 votes
    3. [2]
      arghdos

      There’s a million dataflow processors out there that haven’t won much marketshare, for the very simple reason that they’re (generally) a nightmare to program. If it doesn’t run x86/ARM/CUDA, there’s a huge SW ecosystem disadvantage to overcome, even assuming the HW isn’t complete dogwater to write for

      3 votes
      1. Amarok

        These already run NanoGPT without any retraining and without any model retooling. Their hardware is compatible with existing AI workloads and models. You're right about these not being compatible with CPUs, though. In fact they probably won't replace a CPU as the 'brain'. Light is bad at logic because physics: operations we take for granted like AND or XOR just don't work in optics, but math does. They might replace all of the arithmetic done by CPUs, kinda like the old math co-processor. We may be bringing that back. It might even be possible to pop a couple of optical cores into every regular CPU if they work out the conversion issues, but that's still going to be years down the road. There's far more money in AI right now, so the CPU is a low-priority target.
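
        To make the "math works, logic doesn't" point concrete, here is a toy sketch in pure Python (the noise figure is purely illustrative, not anything from the video or the company's specs): an analog multiply that picks up a small random error still yields an accurate matrix-vector product, because linear algebra degrades gracefully under analog noise, whereas exact Boolean logic would need a clean 0/1 at every single step.

```python
import random

random.seed(0)
NOISE = 1e-3  # hypothetical relative analog error per multiply (illustrative only)

def analog_mul(a, b):
    """One 'optical' multiply: the exact product plus a small analog error."""
    return a * b * (1.0 + random.uniform(-NOISE, NOISE))

def analog_matvec(M, x):
    """Matrix-vector product built entirely from noisy analog multiplies."""
    return [sum(analog_mul(M[i][j], x[j]) for j in range(len(x)))
            for i in range(len(M))]

M = [[1.0, 2.0], [3.0, 4.0]]
x = [0.5, -1.5]
exact = [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]
noisy = analog_matvec(M, x)

# Worst-case relative error stays on the order of NOISE: the sum averages
# independent errors instead of amplifying them.
rel_err = max(abs(n - e) / max(abs(e), 1e-12) for n, e in zip(noisy, exact))
print(f"relative error: {rel_err:.2e}")
```

A digital AND or XOR gate gives no such grace: the output must be exactly 0 or 1, so any analog error past the decision threshold flips the bit outright.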

        5 votes
    4. [3]
      chili-man

      Yeah, I'm having trouble finding anything that isn't marketing about this company's design.

      For CPUs, the point early in the video of "nearly 32b precision" (that is, not even 32b of precision) seems to rule this out for general purpose applications.

      Even for this AI workload, I'm a little skeptical. I would want to see a couple things: first, whether after precision loss, the models are still useful. ML researchers sweat blood to improve accuracy for a reason, throwing that out because of the hardware platform doesn't make much sense. Second, if we are willing to accept lower accuracy, how does it compare to running a model that had the lower accuracy (= smaller model) in the first place?

      If anyone found anything solid about this I'd be interested to see it, but analog computing hasn't been a good idea all these years, so the bar for it to be the right answer now is high.
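
      The precision question can at least be framed with a toy experiment (pure Python, illustrative only; real ML quantization schemes are more sophisticated than uniform rounding): quantize a random weight vector to a given bit width and watch the dot-product error shrink as the bit count grows.

```python
import random

random.seed(1)

def quantize(w, bits):
    """Uniformly quantize values in `w` to 2**bits levels over their range."""
    levels = 2 ** bits - 1
    lo, hi = min(w), max(w)
    scale = (hi - lo) / levels
    return [lo + round((v - lo) / scale) * scale for v in w]

# Toy "model": a single dot product with random weights and inputs.
w = [random.uniform(-1, 1) for _ in range(256)]
x = [random.uniform(-1, 1) for _ in range(256)]
exact = sum(a * b for a, b in zip(w, x))

for bits in (16, 8, 4):
    wq = quantize(w, bits)
    approx = sum(a * b for a, b in zip(wq, x))
    print(f"{bits:2d} bits: |dot-product error| = {abs(approx - exact):.6f}")
```

This is exactly the shape of the second question above: the interesting comparison is not "quantized vs. exact" but "quantized big model vs. natively small model at the same effective bit budget", and a toy like this can't answer that.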

      2 votes
      1. Greg

        ML researchers sweat blood to improve accuracy for a reason, throwing that out because of the hardware platform doesn't make much sense.

        To generalise very, very broadly, the current approach tends to get better results by trading off numerical precision for more parameters, at least up to a point. For things like LLMs you’re much more likely to be running at 16 bit precision or lower even on existing hardware, so if they’re comfortably above that I don’t see it being a limitation for a lot of workloads.

        I don’t know enough about chip design or photonics in general to assess their claims more broadly, but for the models most people are familiar with there shouldn’t be any quality loss, at least.
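
        As a rough sanity check on what 16-bit precision actually costs (a sketch using only Python's standard library, which can round-trip floats through IEEE half precision; no claims about the chip itself): the relative rounding error for normal fp16 values is bounded by 2**-11, about 0.05%, which modern LLM inference routinely tolerates.

```python
import struct

def to_fp16(x):
    """Round a float to IEEE 754 half precision and back (struct format 'e')."""
    return struct.unpack('e', struct.pack('e', x))[0]

for v in (0.1, 3.14159, 1e-4, 1000.5):
    h = to_fp16(v)
    print(f"{v:>10} -> {h:.6g}  (relative error {abs(v - h) / abs(v):.2e})")
```

So a photonic part delivering anywhere near 32-bit precision would sit comfortably above what those deployed models already live with.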

        3 votes
      2. Amarok

        I admit to a bit of skepticism as well (too excellent to be true usually is), but I doubt we'll have long to wait before internet detectives compile a list of what every employee had for lunch if this news goes out bigly. I will say that the approach they describe is pretty interesting, and I've never come across it before. We can of course take a look at their patents, but frankly, this level of chip design is above my pay grade so I'm not going to be much use there.

        What I'd love to see is a couple of hardcore tech guys get their hands on this and put it through its paces in the lab. The lack of a 'demo unit' being reviewed independently is a red flag in my book, and it's the next logical step if they want real investing attention.

        However, one other thing I've noticed is that there are literally zero photonics topics out there that don't instantly get dragged down by 500 comments saying it's vaporware. It's stunningly ubiquitous and that makes no sense to me. It's about as intelligent as the 'fusion is always fifty years away' crowd. Photonics has been thirty years away for about thirty years by my count, so perhaps it's time. /shrug

        2 votes
  2. [3]
    Amarok

    This is a great watch, Anastasi is a subject matter expert.

    I did not expect the first line of analog photonic chips to hit the ground with a 1000x speed improvement over silicon (and that's the least of the news here). I've gone from wondering where I'm going to put my data center's nuclear reactor to wondering where I'm going to put the giant space heaters to keep the ice off the data center floor during winter months. This is excellent news for people concerned about the power consumption and climate consequences of all this compute we're building.

    This is also how you get GPT-Omega running on a $300 card in your workstation while answering questions fast as you ask them.

    6 votes
    1. [2]
      balooga

      I wonder if the Stargate Project data center is going to be full of these things. That would absolutely shake things up…

      1 vote
      1. mrl515

        Being tangentially involved with it, I am afraid to say it will not, but it's certainly something folks are thinking about.

        2 votes
  3. [2]
    Omnicrola

    This was a really interesting explainer, even though I'm still not clear on exactly how the photonic chip works. That's on me to go learn more though. It's also really refreshing to see a video/article hyping a technology that's actually real and being deployed NOW and not either entirely theoretical or useful only in a lab.

    3 votes
    1. jredd23

      I'm glad I'm not the only one; I came to the comments to get a better understanding of it. Hopefully someone who is on the ground can explain it. My next to-do: RTFM.

      1 vote