Developer Runs Doom On $50 Apple Lightning To HDMI Adapter

A developer has successfully run the classic video game Doom on Apple's $50 Lightning to HDMI adapter, exploiting the device's built-in system-on-chip that runs a simplified iOS version.
  • by zuckie13 ( 1334005 ) on Wednesday February 05, 2025 @11:12AM (#65143921)

    The iOS SoC in that adapter probably has more capability than the entire desktops that originally ran it, so sure, why not.

    • Re: (Score:2, Troll)

      I guess my question is... why the fuck do we need a chip like that in an hdmi adapter
      • by wed128 ( 722152 )
        My guess is that the process of negotiation and conversion for lightning and HDMI is computationally expensive enough, and the chip is inexpensive enough, that it's not worth it to use a lesser processor.
        • Nope. Apple made a custom ASIC to make the lives of third parties harder.
          • Apple made a custom asic to make the lives of third parties harder

            Well, also because the Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry a display signal at all (as opposed to USB-C in DisplayPort Alt Mode, or an MHL-enabled micro-USB connector).
            The only way to make an iShiny device display on a screen is to have the device "kind-of-sort-of-AirPlay" its screen to the adapter. And inside the adapter have a miniature computer with an actual HDMI-out run a streaming video player.

            But yes, they did it in the most an

            • also because Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry any display signal at all

              That's not why Lightning can't do those things. It's because it would have been incompatible with Apple's DRM chip in the cable, which is the reason why Lightning is crap in general. As you note, you could already do it with Micro when Lightning was announced. What most people don't know is that Type C was also brought to the USB-IF before any devices with Lightning were released... And Apple is a member of the USB-IF.

              • also because Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry any display signal at all

                That's not why Lightning can't do those things. It's because it would have been incompatible with Apple's DRM chip in the cable, which is the reason why Lightning is crap in general. As you note, you could already do it with Micro when Lightning was announced. What most people don't know is that Type C was also brought to the USB-IF before any devices with Lightning were released... And Apple is a member of the USB-IF.

                But USB-C wasn't even Ratified when all this was going on. Apple was under the gun by the EU to adopt microUSB (insert Eyeroll!) for their "Charging Cable Standard". Because Compliance was allowed with an Adapter Cable, and because Apple was looking to ditch its 30-pin "Dock Connector" anyway, they Developed Lightning, and shipped iPhones and iPads in the EU with a lightning to USB Adapter Cable.

                BTW, Apple freely contributed its Direction-Agnostic feature from Lightning to the USB-IF (ever wonder why there

                • Apple freely contributed its Direction-Agnostic feature from Lightning to the USB-IF (ever wonder why there were no Patent Challenges to that Clearly Patented Feature?), or we'd all still be flipping our USB-C Connectors! Consider that, every single time you plug in a USB-C or Lightning Cable. . .

                  It's not relevant because USB does it in a different way.

                  Every time I plug in a lightning cable, what I think of is that my work iPhone charges with a Type C to lightning cable, and the Lightning end is the end that's having problems.

                • BTW, Apple freely contributed its Direction-Agnostic feature from Lightning to the USB-IF

                  Lightning's and USB-C's reversibility don't work in exactly the same way.

                  Lightning cables have the exact same pins mirrored on both sides of the connector, so no matter which way you plug it in, the receptacle still sees the same signal at the same spot. It relies entirely on rotational symmetry to achieve its goals, and a device can work by only really using the pins from one side.

                  USB-C has some pins which are symmetrical (the power pins, and the USB 2 data pins if in USB mode) so it could work either way, but not all of
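The pin-symmetry difference described in this subthread can be sketched as a toy model (purely illustrative; real USB-C orientation detection uses analog voltage thresholds on the CC lines, not booleans, and the function names here are made up):

```python
# Toy model: Lightning mirrors every pin (rotational symmetry), so the
# receptacle sees the same signal regardless of orientation. USB-C instead
# detects orientation by which CC pin sees the source's pull-up, then
# routes the non-mirrored high-speed lanes accordingly.

def lightning_pin(pin: int, flipped: bool) -> str:
    # Full mirror symmetry: orientation is irrelevant to the signal seen.
    return f"signal[{pin}]"

def usbc_orientation(cc1_pulled_up: bool, cc2_pulled_up: bool) -> str:
    # The sink checks which CC line the source is pulling up.
    if cc1_pulled_up and not cc2_pulled_up:
        return "normal"
    if cc2_pulled_up and not cc1_pulled_up:
        return "flipped"
    return "unknown"

assert lightning_pin(3, flipped=True) == lightning_pin(3, flipped=False)
assert usbc_orientation(True, False) == "normal"
assert usbc_orientation(False, True) == "flipped"
```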

              • by DrYak ( 748999 )

                It's because it would have been incompatible with Apple's DRM chip in the cable,

                What does the chip that confirms it's an original Apple cable have to do with the signal that's carried over the cable?

                I mean, in theory you could imagine a "Lightning 2.0" standard that still uses the same pinout, that still uses the same chip to confirm it's an Apple-approved cable, but is physically capable of carrying a much higher-bandwidth signal and has a way to negotiate sending HDMI's TMDS down those new fast data lanes instead of USB.
                And in fact, Apple has done something similar, with some of the later iPa

          • Nope. Apple made a custom asic to make the lives of third parties harder

            And somebody Ported DOOM to an ASIC?!?

        • by DrYak ( 748999 ) on Wednesday February 05, 2025 @05:09PM (#65145183) Homepage

          My guess is that the process of negotiation and conversion for lightning and HDMI is computationally expensive enough,

          You're on the right track but the details are even worse than that.

          Unlike USB-C (which in "Alt Mode" can map DisplayPort onto some of the high-speed lanes, instead of USB 3) and micro-USB (which can be used in "MHL" mode, i.e., the entire port speaks HDMI instead of USB 2 and can be connected to a screen's regular HDMI using a special MHL cable), Apple's Lightning doesn't have any capability to carry an actual display signal. At all. Zilch. Nada. No HDMI, no DisplayPort.

          So the only way they could do an HDMI adapter is by streaming compressed videos to the adapter and the adapter itself generating its own HDMI display signal.
          It's the same kind of setup as if you were sharing your screen over Wi-Fi with AirPlay to another Apple computer, except they communicate over USB instead of Wi-Fi.
          So basically you have a kind of miniature very-stripped down "apple-tv-like" device inside the adapter, and the Apple iShiny streams compressed video to it.

          See this source [engadget.com], though they say it doesn't run a stripped-down iOS. Technically they're right, as that other source [hackaday.com] mentions: it only runs a boot loader, and the "ultra-stripped-down" system that runs in there is sent each time the adapter is initialized (which allows video quality to improve and artifacting to be reduced as the software improves).

          Given that Doom could initially run on a 386 and has even been ported to the Raspberry Pi Pico [github.com], someone was bound to find a way to cram it into an adapter that has an entire mini computer designed for streaming video (it would almost sound luxurious on the scale of other micro devices Doom has been crammed into).

          The magic which is new to this accomplishment is finding a way to send the Doom image to run instead of the stripped-down iOS that the iShiny would normally be sending, despite all the DRM shenanigans that Apple pulls.
          According to the author, it's very similar to hacking an iOS device of the same vintage (can't link to the YouTube comment, so I am copy-pasting nyan_satan's explanation verbatim):

          @nyan_satan:

          Yea, the dongle's firmware is super stripped-down iOS, basically

          There is SecureROM, iBoot and XNU as a kernel - just like some iPhone or iPad of that era (now is the same, but obviously they did a lot of development since then)

          Production firmware's userspace is ultra-minimalistic though - there's a ramdisk, but it's not even a filesystem, but a statically compiled Mach-O (it's like ELF, but for Apple *OS)

          Internal development bundles do have a proper ramdisk with filesystem and a bunch of executables/shared libraries on it

          The Mac here just loads such firmware into it, since the dongle doesn't have any persistent storage. The colorful logs going in one of the terminals are UART output from it - first iBoot and then kernel and userspace

          Arbitrary code execution is achieved due to iOS-world bootrom exploit - checkm8, which also works here because codebase is literally the same

      • by edwdig ( 47888 )

        Doom ran on a 386, which was released in 1985. There's not much value in mass-producing chips weaker than that today.

        • Crawled, more like. It only ran on a 486 DX.

          • Crawled, more like. It only ran on a 486 DX.

            It ran on a Super Nintendo with a co-processor that definitely was not a 486 DX.

            • SNES Doom is a quite different game that utilizes the textures and most of the levels of Doom. It uses a different engine that lacks a lot of graphics features (flats, transparent textures), has monster graphics for only one side rather than eight, lacks most of the monster AI, etc. And even with those limitations, it shows only a small window at halved resolution.

              • And what version of the game do you think is running on this SoC?

                  • Probably the DOS version, since it makes very few assumptions about the hardware present. The SNES is the exact opposite.

                  • I'd guess it's more probably a port of Linux Doom, since that's designed for a more similar environment to what's running on this device, and it was still both as simple as possible and a CPU-rendered, framebuffer-drawn game.

                    I ran Linux Doom on a 386DX25 with 8MB RAM and a relatively speedy 1MB Trident VGA card. It worked OK... at 320x200.

                    I ran Quake on a 486DX25 with 16MB, too. It worked, not great, but OK... at the same resolution. Doom would work smoothly at up to 512x384 or so.

          • by ed1park ( 100777 )

            Agreed. I remember lurking on BBS forums on my 386 reading about how we would all need to upgrade to a 486 with a whopping 4MB of RAM to play it properly! One of the most anticipated games ever until Quake in those days. Ah the memories.

          • by alvian ( 6203170 )
            It ran fine on my 486 SX
          • Doom ran fine on a 386 [reddit.com].

      • There is insufficient bandwidth for full HDMI. So the iOS device compresses output video and the HDMI adapter decompresses it.
      • by tlhIngan ( 30335 )

        I guess my question is... why the fuck do we need a chip like that in an hdmi adapter

        The protocol for video over lightning requires compressing the video down, so inside the adapter is a little Apple SoC that runs iOS on it. Its primary purpose is to decompress the video being sent to it over Lightning.

        Lightning didn't have enough bandwidth to carry video so that's what they did to support it. If the content is say, app content then there's significant black space that simply doesn't have to be sent so the

        • I guess my question is... why the fuck do we need a chip like that in an hdmi adapter

          The protocol for video over lightning requires compressing the video down, so inside the adapter is a little Apple SoC that runs iOS on it. Its primary purpose is to decompress the video being sent to it over Lightning.

          Lightning didn't have enough bandwidth to carry video so that's what they did to support it. If the content is say, app content then there's significant black space that simply doesn't have to be sent so the video data can carry more app content, and the adapter handles the necessary scaling/stretching and black bars.

          Of course, one needs to realize it was introduced in 2012, when USB2 was basically the fastest interface around, and lightning didn't offer much more than that in bandwidth. It's possible the lightning altmodes inspired the USB-C altmode as well.

          Very possible, since Apple was on the USB-IF Committee, and they made other significant contributions to the USB-C Standard; most notably, Lightning's Patented Direction-Agnostic Connection Scheme and Hardware Design; which is exactly why we don't have to Flip our USB-C (and Lightning) Connectors Three Times. . .
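The bandwidth point in this subthread can be sanity-checked with back-of-envelope arithmetic (the numbers below are my own rough assumptions, not from the thread): uncompressed 1080p60 video vastly exceeds USB 2.0-class signaling, which is why the adapter has to decode a compressed stream rather than receive raw pixels.

```python
# Back-of-envelope check: uncompressed 1080p60 video vs. USB 2.0-class
# bandwidth (Lightning of that era offered roughly USB 2.0 speeds).

uncompressed_bps = 1920 * 1080 * 60 * 24   # 1080p, 60 fps, 24 bits/pixel
usb2_bps = 480e6                            # USB 2.0 signaling rate

print(f"1080p60 uncompressed: {uncompressed_bps / 1e6:.0f} Mbit/s")
print(f"USB 2.0 signaling:    {usb2_bps / 1e6:.0f} Mbit/s")
print(f"Over budget by:       {uncompressed_bps / usb2_bps:.1f}x")
```

Even before protocol overhead, the raw pixel stream is several times over budget, so some form of compression (and a decoder on the far end) is unavoidable.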

      • I guess my question is... why the fuck do we need a chip like that in an hdmi adapter

        Fast-to-Market.

        They already had the software done and dusted for the same, or very similar, SoC. Plus, they were probably already ordering plenty of that particular SoC for something(s) else; so diverting a few percent to this low-volume accessory was obviously no problem.

        Add to that the much faster Engineering Turnaround, and it makes perfect sense.

        But I agree, it is still kinda amusing!

  • Pretty sure I need some controlled supply lines and a way to validate code signatures.
    • and a way to validate code signatures.

      Yes, according to a comment from the author, the magic is hacking around the DRM, and is achieved using exploits for iOS of the same vintage, because that embedded computer inside the adapter runs(*) a very stripped-down version of iOS.

      (*): Well, technically it doesn't have any local storage, so in practice the adapter waits for that stripped-down iOS to be sent from the iShiny device each time it is initialised (normally allowing for better image quality as the software improves, or sending a hacked firmware

  • by Petersko ( 564140 ) on Wednesday February 05, 2025 @11:14AM (#65143927)

    I looked it up. You can't upgrade the RAM. Or anything else for that matter. Why anybody would lock themselves in like that is beyond me.

    • Well now you've ruined my day. If I can't run FS2024 on it what's the point......

    • Re:Typical Apple. (Score:4, Interesting)

      by OrangeTide ( 124937 ) on Wednesday February 05, 2025 @11:35AM (#65144011) Homepage Journal

      put Linux on it. Then you can run ZRAM to increase available memory and a swap drive over the USB connector. Basically an infinite amount of address space in a tiny little box.

    • by mjwx ( 966435 )

      I looked it up. You can't upgrade the RAM. Or anything else for that matter. Why anybody would lock themselves in like that is beyond me.

      I'd ask why a display port converter needs RAM at all... let alone enough processing power that it can run a 486 game.

      • Think of it as being nearly firmware-free. Since the firmware is loaded externally on connection, it always has a compatible version. It's not just shifting the video like for like. The devices have different resolutions and aspect ratios.

  • These have gotten lame now. Any Turing machine runs Doom. And no, I don't want to see Doom running on a tape-moving machine. How about doing something else for a change? I like the adapter-hacking part, but then running Doom on it is like, whatever. Don't get me wrong, I loved Doom back in 1993 or whenever, but did we really need to keep rehashing Doom and only Doom? It had a great contribution to gaming and 3D, but how about maybe run a demo of Maze War and then show some video game evolution all the way up to a

    • Doing it with a decent framerate at least demonstrates some decent MIPS and I/O capabilities that aren't addressed in the Turing machine model of computing. That's the problem with a TM: it can't tell you the time of day, because execution time is a side effect that lives outside of that theoretical model.

      • by narcc ( 412956 )

        decent MIPS and I/O capabilities that aren't addressed in the Turing machine model of computing

        Er, you might want to dig out your old notes. Or borrow someone else's.

        That's the problem with a TM. It can't tell you the time of day

        Don't be so sure [youtube.com]

        because execution time is a side effect that lives outside of that theoretical model

        Hmm... Maybe skip the notes and dust off your old text book.

        • Cute video, though not actually using Turing's description. But that's fine; it's enjoyable at least.
          It was extremely useful for generalizing computational theory and algorithms, and in demonstrating the halting problem.

          You might want to check your own notes on what an algorithm is.

          • by narcc ( 412956 )

            You might want to check your own notes on what an algorithm is.

            Where do you believe I've gone wrong?

            • I mean, his username seems to indicate he's a supporter of a particular political person... and we all know what those people are like... ;)
              • I really must have been an impossibly early adopter of the Orange King's cult to have such a low uid.

        • by wed128 ( 722152 )
          That video is impressive, but if it keeps accurate time (there's no audio description, and the video isn't long enough to tell), then either it's receiving an external time signal (not part of the Turing model) or it's dependent on the GOL step rate (also not part of the Turing model)
          • by tepples ( 727027 )

            The existence of busy beaver machines shows that step rate is part of the Turing model.

    • Though I suspect you wouldn't know where to start.

    • by Rinnon ( 1474161 )
      Yeah, I too remember when the internet was full of possibilities. Now it's full of memes. Trying to suggest memers "do something more productive" though is a bit... well you get the idea.
    • Re:So what? (Score:5, Informative)

      by ClickOnThis ( 137803 ) on Wednesday February 05, 2025 @11:46AM (#65144051) Journal

      The astounding thing about a waltzing bear is not how gracefully it waltzes but that it waltzes at all. - I think this is from Robert A. Heinlein's Time Enough for Love.

      It seems Doom on obscure platform [whatever] has become the new waltzing bear. Yeah, lame in terms of utility, but good for geek creds.

      Now I'd like to see Doom running as an Excel spreadsheet. Oh, wait. [gamedeveloper.com]

      • Pfff. Excel is Turing-complete, so that doesn't impress me. What impresses me is that now, when your wife tells you that you're Doomed for the next 18 years, you don't know if it's because her pregnancy test was positive, or if she was actually simply playing Doom on it. https://www.youtube.com/watch?... [youtube.com]

    • Re:So what? (Score:4, Interesting)

      by smoot123 ( 1027084 ) on Wednesday February 05, 2025 @11:51AM (#65144057)

      These have gotten lame now. Any Turing machine runs doom.

      Well, yes, but. I'm continuously astounded at (a) what everyday items have embedded SoCs in them and (b) how creative people get reprogramming those controllers.

      I'm also gobsmacked the easiest way to make the dongle work was to produce a stripped down iOS. I mean, I get it, you've got the OS source just sitting there in your local repo and perhaps sharing key exchange code would be handy. But surely a video dongle could just run some code on the bare silicon in real mode. Why does it need an OS at all?

      • by GoTeam ( 5042081 )

        Why does it need an OS at all?

        So the "right" group can slip something in there if needed... but it'll just be used against the bad guys. For real this time!

        • Why does it need an OS at all?

          So the "right" group can slip something in there if needed... but it'll just be used against the bad guys. For real this time!

          That SoC doesn't even have a ROM. The firmware is loaded every time it's plugged in.

          And wtf would an iPhone-HDMI dongle be an attack vector for? Like you couldn't just sneak a SoC into a generic charger adapter instead.

          • That SoC doesn't even have a ROM. The firmware is loaded every time it's plugged in.

            So, where's the bootloader? Does the host RDMA the code to memory and use some weird signal to reset/start the processor?

      • by Anonymous Coward

        >Why does it need an OS at all?
        It doesn't, and in my cynical opinion, it's probably because whoever did the design at Apple simply didn't know how to write code for an embedded processor without an operating system.
        Think of how many developers design Docker images for cloud services that contain bloated operating systems to run a microservice. Or Node projects with dozens of dependencies for trivial functions. Or how many "desktop apps" are just bloated 100MB-300MB Electron webapps.
        If it works acceptably, i

      • These have gotten lame now. Any Turing machine runs doom.

        Well, yes, but. I'm continuously astounded (a) what everyday items have embedded SoC's on them and (b) how creative people get reprogramming those controllers.

        I'm also gobsmacked the easiest way to make the dongle work was to produce a stripped down iOS. I mean, I get it, you've got the OS source just sitting there in your local repo and perhaps sharing key exchange code would be handy. But surely a video dongle could just run some code on the bare silicon in real mode. Why does it need an OS at all?

        It's not an OS by any definition; the firmware is just a kernel that runs a single process, without even a filesystem. The dev firmware apparently has a little filesystem with some debug utilities; that's what this guy used.

        https://panic.com/blog/the-lig... [panic.com]

        • It's not an OS by any definition, the firmware is just a kernel that runs a single process, not even a filesystem.

          Which all sounds more and more reasonable. I just wonder what services the rump OS provides. Memory management? Interrupts? Threading? Device drivers? Encryption/decryption? I find it hard to believe you need any of that.

          About the only thing I can think you'd want is initializing the hardware. There's often an annoying amount of gobbledygook you need to do to get the processor running, even in real mode.

    • by Tarlus ( 1000874 )

      I think the bigger takeaway is the absurdity of an HDMI cable requiring an embedded SoC.

      • the absurdity of an HDMI cable requiring an embedded SOC

        It comes from how shitty Lightning is. There's no way to output an actual display signal on it. No DisplayPort nor HDMI (as opposed to, say, USB-C in DP Alt Mode or an MHL-capable micro-USB).

        The only way to get a screen output out of an iShiny device is to "sort-of-kind-of-AirPlay" the display over to the adapter, where a tiny stripped down computer runs a video stream player to generate the actual HDMI signal.

    • The real question is - will it run Crysis?
    • Blasphemy
    • by suutar ( 1860506 )

      Running Doom is the 'hello world' of graphical apps now

  • by Viol8 ( 599362 ) on Wednesday February 05, 2025 @11:39AM (#65144025) Homepage

    Just to convert from one serial protocol to another? Talk about overkill.

  • ... errrm ... nevermind.

  • by JudgeFurious ( 455868 ) on Wednesday February 05, 2025 @12:43PM (#65144223)

    The older I get the more convinced I am that one day I will see someone running Doom on a hearing aid.

  • by paradigm82 ( 959074 ) on Wednesday February 05, 2025 @03:21PM (#65144789)
    Doom also runs on a Raspberry Pi Pico - the MCU costs $1 and a board with flash etc. can easily be had for $3. That buys you a board with a dual-core 133 MHz CPU (+ PIO helper "cores"), 2 MB flash, and RAM. There is not even a dedicated graphics card, but the CPU cores are fast enough that they can bit-bang the video and sound signals: https://www.youtube.com/watch?... [youtube.com]
    • a board with dual-core 133 MHz CPU (+ PIO helper "cores") {...} but the CPU cores are fast enough that they can bit-bang the video and sound signals:

      Well, according to the author, it's not the CPU directly bitbanging, but one of the PIO state-machine cores driving the display:
      https://kilograham.github.io/r... [github.io]
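A quick cycle-budget calculation shows why the PIO offload matters (assuming the standard 25.175 MHz VGA 640x480@60 dot clock and the RP2040's stock 133 MHz core clock; the actual port's timings may differ):

```python
# Rough cycle budget for software bit-banging VGA on an RP2040-class MCU:
# how many CPU cycles are available per output pixel.

pixel_clock_hz = 25.175e6   # standard VGA 640x480@60 dot clock (assumed)
cpu_hz = 133e6              # RP2040 stock core clock

cycles_per_pixel = cpu_hz / pixel_clock_hz
print(f"{cycles_per_pixel:.1f} CPU cycles per pixel")
```

A handful of cycles per pixel leaves essentially no headroom for running the game itself, which is why handing the pixel shifting to a PIO state machine is the sensible design.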

  • Here's a functioning Turing Machine Computer out of Lego:
    https://ideas.lego.com/project... [lego.com]

    That's a good start. And a fantastically interesting video.
