
Developer Runs Doom On $50 Apple Lightning To HDMI Adapter
A developer has successfully run the classic video game Doom on Apple's $50 Lightning to HDMI adapter, exploiting the device's built-in system-on-chip that runs a simplified iOS version.
Probably more power than a 386 Desktop (Score:5, Informative)
The iOS SoC in that adapter probably has more capability than the entire desktops that originally ran the game, so sure, why not.
Re: (Score:2, Troll)
Re: (Score:1)
Re: Probably more power than a 386 Desktop (Score:2)
Limitation (Score:3)
Apple made a custom asic to make the lives of third parties harder
Well, also because the Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry a display signal at all (as opposed to USB-C in DisplayPort Alt Mode, or an MHL-enabled micro-USB connector).
The only way to make an iShiny device display on a screen is to have the device "kind-of-sort-of-AirPlay" its screen to the adapter, and inside the adapter have a miniature computer with an actual HDMI-out run a streaming video player.
But yes, they did it in the most an
Re: (Score:3)
also because Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry any display signal at all
That's not why Lightning can't do those things. It's because it would have been incompatible with Apple's DRM chip in the cable, which is the reason why Lightning is crap in general. As you note, you could already do it with Micro when Lightning was announced. What most people don't know is that Type C was also brought to the USB-IF before any devices with Lightning were released... And Apple is a member of the USB-IF.
Re: (Score:2)
also because Lightning port dates from antiquity (at least in computing terms), and thus doesn't have any capability to carry any display signal at all
That's not why Lightning can't do those things. It's because it would have been incompatible with Apple's DRM chip in the cable, which is the reason why Lightning is crap in general. As you note, you could already do it with Micro when Lightning was announced. What most people don't know is that Type C was also brought to the USB-IF before any devices with Lightning were released... And Apple is a member of the USB-IF.
But USB-C wasn't even ratified when all this was going on. Apple was under the gun from the EU to adopt micro-USB (insert eyeroll!) as its "charging cable standard". Because compliance was allowed with an adapter cable, and because Apple was looking to ditch its 30-pin "Dock Connector" anyway, they developed Lightning, and shipped iPhones and iPads in the EU with a Lightning-to-USB adapter cable.
BTW, Apple freely contributed its direction-agnostic feature from Lightning to the USB-IF (ever wonder why there
Re: (Score:3)
Apple freely contributed its direction-agnostic feature from Lightning to the USB-IF (ever wonder why there were no patent challenges to that clearly patented feature?), or we'd all still be flipping our USB-C connectors! Consider that, every single time you plug in a USB-C or Lightning cable. . .
It's not relevant, because USB-C does it in a different way.
Every time I plug in a lightning cable, what I think of is that my work iPhone charges with a Type C to lightning cable, and the Lightning end is the end that's having problems.
Reversible cable (Score:2)
BTW, Apple freely contributed its Direction-Agnostic feature from Lightning to the USB-IF
Lightning's and USB-C's reversibility don't work the same way at all.
Lightning cables have the exact same pins mirrored on both sides of the connector, so no matter which way you plug it in, the receptacle still sees the same signal at the same spot. It relies entirely on rotational symmetry to achieve its goal, and a device can work by only really using the pins from one side.
USB-C has some pins which are symmetrical (the power pins, and the USB 2.0 data pins when in USB mode) so it could work either way, but not all of
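For the curious, the USB-C side figures out orientation through the CC pins rather than mirrored signals. A minimal sketch of the idea in C (the ADC helper is a made-up placeholder, stubbed with canned values so the sketch runs; real Type-C controllers do this in hardware with standardized Rp/Rd thresholds):

```c
#include <stdio.h>

/* Hypothetical ADC read, stubbed with canned values so the sketch runs.
 * On real silicon this would sample the voltage on a CC pin. */
static int read_cc_millivolts(int cc_pin) {
    return (cc_pin == 1) ? 410 : 3300;   /* pretend CC1 sees the sink's Rd */
}

typedef enum { NOT_ATTACHED, NORMAL, FLIPPED } cc_orientation;

/* The source pulls both CC pins up; the cable wires only ONE CC line
 * through to the sink's Rd pull-down, so exactly one pin sags toward
 * ground -- and which one it is reveals the plug orientation. */
static cc_orientation detect_orientation(void) {
    int cc1 = read_cc_millivolts(1);
    int cc2 = read_cc_millivolts(2);
    const int rd_threshold = 1600;       /* simplified detection threshold */

    if (cc1 < rd_threshold && cc2 >= rd_threshold) return NORMAL;
    if (cc2 < rd_threshold && cc1 >= rd_threshold) return FLIPPED;
    return NOT_ATTACHED;
}

int main(void) {
    const char *names[] = { "not attached", "normal", "flipped" };
    printf("plug orientation: %s\n", names[detect_orientation()]);
    return 0;
}
```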
DRM (Score:2)
It's because it would have been incompatible with Apple's DRM chip in the cable,
What does the chip that confirms it's an original Apple cable have to do with the signal carried over the cable?
I mean, in theory you could imagine a "Lightning 2.0" standard that still uses the same pin-out, still uses the same chip to confirm it's an Apple-approved cable, but is physically capable of carrying a much higher-bandwidth signal and has a way to negotiate sending HDMI's TMDS down those new fast data lanes instead of USB.
And in fact, Apple has done something similar, with some of the later iPa
Re: (Score:2)
Nope. Apple made a custom asic to make the lives of third parties harder
And somebody ported DOOM to an ASIC?!?
Lightning-to-HDMI indeed, but worse. (Score:5, Informative)
My guess is that the process of negotiation and conversion for lightning and HDMI is computationally expensive enough,
You're on the right track but the details are even worse than that.
Unlike USB-C (which in "Alt Mode" can map DisplayPort onto some of the high-speed lanes instead of USB 3) and micro-USB (which can be used in "MHL" mode, i.e., the entire port speaks HDMI instead of USB 2 and can be connected to a screen's regular HDMI using a special MHL cable), Apple's Lightning doesn't have any capability to carry an actual display signal. At all. Zilch. Nada. No HDMI, no DisplayPort.
So the only way they could do an HDMI adapter is by streaming compressed video to the adapter and having the adapter itself generate its own HDMI display signal.
It's the same kind of setup as if you were sharing your screen over Wi-Fi with AirPlay to another Apple device, except they communicate over USB instead of Wi-Fi.
So basically you have a kind of miniature, very stripped-down "Apple-TV-like" device inside the adapter, and the Apple iShiny streams compressed video to it.
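To make that concrete, here's a toy sketch in C of what the adapter-side "stream player" loop conceptually does. Every interface below is a hypothetical stand-in, since Apple's actual firmware internals aren't public:

```c
/* Toy sketch of the adapter-side loop: pull compressed video packets
 * off the Lightning/USB link, decode them with the hardware codec,
 * and scan the frames out as HDMI. All four device interfaces are
 * hypothetical stand-ins -- Apple's real firmware APIs aren't public. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct { uint8_t data[64 * 1024]; size_t len; } packet_t;
typedef struct { uint32_t *pixels; int width, height; } frame_t;

bool usb_read_packet(packet_t *pkt);          /* hypothetical */
bool hw_decoder_push(const packet_t *pkt);    /* hypothetical */
bool hw_decoder_pull_frame(frame_t *frame);   /* hypothetical */
void hdmi_scanout(const frame_t *frame);      /* hypothetical */

void adapter_main_loop(void) {
    static packet_t pkt;
    frame_t frame;
    for (;;) {
        /* 1. The iDevice "AirPlays" compressed video over the cable. */
        if (usb_read_packet(&pkt))
            hw_decoder_push(&pkt);
        /* 2. Whenever a full frame decodes, drive it out as real HDMI. */
        if (hw_decoder_pull_frame(&frame))
            hdmi_scanout(&frame);
    }
}
```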
See this source [engadget.com], though they say it doesn't run a stripped-down iOS. Technically they're right, as that other source [hackaday.com] mentions: the adapter itself only holds a boot loader, and the "ultra-stripped-down" system that runs in there is sent over each time the adapter is initialized (which allows video quality to improve and artifacting to be reduced as the software improves).
Given that Doom initially ran on a 386 and has even been ported to the Raspberry Pi Pico [github.com], someone was bound to find a way to cram it into an adapter that contains an entire mini computer designed for streaming video (which almost sounds luxurious on the scale of other micro devices Doom has been crammed into).
The magic that's new in this accomplishment is finding a way to send the Doom image to run instead of the stripped-down iOS that the iShiny would normally be sending, despite all the DRM shenanigans that Apple pulls.
According to the author, it's very similar to hacking an iOS device of the same vintage (I can't link to the YouTube comment, so I am copy-pasting nyan_satan's explanation verbatim):
Re: (Score:2)
Doom ran on a 386, which was released in 1985. There's not much value in mass-producing chips weaker than that today.
Re: (Score:2)
Crawled, more like. It only ran on a 486 DX.
Re: (Score:2)
Crawled, more like. It only ran on a 486 DX.
It ran on a Super Nintendo with a co-processor that definitely was not a 486 DX.
Re: (Score:2)
SNES Doom is quite a different game that reuses the textures and most of the levels of Doom. It runs on a different engine that lacks a lot of graphics features (flats, transparent textures), has monster sprites for only one facing rather than eight, lacks most of the monster AI, etc. And even with those limitations, it shows only a small window at halved resolution.
Re: Probably more power than a 386 Desktop (Score:2)
And what version of the game do you think is running on this SoC?
Re: Probably more power than a 386 Desktop (Score:2)
Probably the DOS version, since it makes very few assumptions about the hardware present. The SNES is the exact opposite.
Re: (Score:2)
I'd guess it's more probably a port of Linux Doom, since that was designed for an environment more similar to what's running on this device, and it was still both as simple as possible and a CPU-rendered, framebuffer-drawn game.
I ran Linux Doom on a 386DX25 with 8MB RAM and a relatively speedy 1MB Trident VGA card. It worked OK... at 320x200.
I ran Quake on a 486DX25 with 16MB, too. It worked, not great, but OK... at the same resolution. Doom would work smoothly at up to 512x384 or so.
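That framebuffer-drawing model is still alive on Linux today. A minimal self-contained example of the style (assumes a 32-bpp /dev/fb0, and should be run from a text console rather than under X/Wayland):

```c
/* Minimal /dev/fb0 example in the spirit of Linux Doom's renderer:
 * map the framebuffer and draw into it directly with the CPU.
 * Assumes a 32-bpp video mode. */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) { perror("open /dev/fb0"); return 1; }

    struct fb_var_screeninfo var;
    struct fb_fix_screeninfo fix;
    ioctl(fd, FBIOGET_VSCREENINFO, &var);
    ioctl(fd, FBIOGET_FSCREENINFO, &fix);

    size_t len = (size_t)fix.line_length * var.yres;
    uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) { perror("mmap"); return 1; }

    /* Fill the screen with a red-to-blue gradient, one scanline at a
     * time -- exactly the kind of CPU-side blitting a software renderer
     * does every frame. */
    for (uint32_t y = 0; y < var.yres; y++) {
        uint32_t *row = (uint32_t *)(fb + (size_t)y * fix.line_length);
        for (uint32_t x = 0; x < var.xres; x++) {
            uint8_t r = (uint8_t)(255 * x / var.xres);
            uint8_t b = (uint8_t)(255 * y / var.yres);
            row[x] = (uint32_t)r << 16 | b;   /* assumes XRGB8888 */
        }
    }
    sleep(3);
    munmap(fb, len);
    close(fd);
    return 0;
}
```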
Re: (Score:2)
Agreed. I remember lurking on BBS forums on my 386 reading about how we would all need to upgrade to a 486 with a whopping 4MB of RAM to play it properly! One of the most anticipated games ever until Quake in those days. Ah the memories.
Re: (Score:2)
Re: (Score:2)
I had a 486SX back then. Doom worked on it, but not smoothly.
Re: (Score:2)
Doom ran fine on a 386 [reddit.com].
Re: (Score:2)
Re: (Score:2)
The protocol for video over lightning requires compressing the video down, so inside the adapter is a little Apple SoC that runs iOS on it. Its primary purpose is to decompress the video being sent to it over Lightning.
Lightning didn't have enough bandwidth to carry video so that's what they did to support it. If the content is say, app content then there's significant black space that simply doesn't have to be sent so the
Re: (Score:2)
The protocol for video over lightning requires compressing the video down, so inside the adapter is a little Apple SoC that runs iOS on it. Its primary purpose is to decompress the video being sent to it over Lightning.
Lightning didn't have enough bandwidth to carry video so that's what they did to support it. If the content is say, app content then there's significant black space that simply doesn't have to be sent so the video data can carry more app content, and the adapter handles the necessary scaling/stretching and black bars.
Of course, one needs to realize it was introduced in 2012, when USB 2.0 was basically the fastest interface around, and Lightning didn't offer much more than that in bandwidth. It's possible the Lightning alt modes inspired the USB-C alt modes as well.
Very possible, since Apple was on the USB-IF committee and made other significant contributions to the USB-C standard; most notably, Lightning's patented direction-agnostic connection scheme and hardware design, which is exactly why we don't have to flip our USB-C (and Lightning) connectors three times. . .
Re: (Score:2)
I guess my question is... why the fuck do we need a chip like that in an hdmi adapter
Fast-to-Market.
They already had the software done and dusted for the same, or a very similar, SoC. Plus, they were probably already ordering plenty of that particular SoC for something(s) else, so diverting a few percent to this low-volume accessory was obviously no problem.
Add to that the much faster Engineering Turnaround, and it makes perfect sense.
But I agree, it is still kinda amusing!
And what else can it run? (Score:1)
Yes, indeed DRM (Score:2)
and a way to validate code signatures.
Yes, according to a comment from the author, the magic is hacking around the DRM, achieved using exploits for iOS of the same vintage, because the embedded computer inside the adapter runs(*) a very stripped-down version of iOS.
(*): Well, technically it doesn't have any local storage, so in practice the adapter waits for that stripped-down iOS to be sent from the iShiny device each time it is initialized (normally allowing for better image quality as the software improves, or sending a hacked firmware
Typical Apple. (Score:5, Funny)
I looked it up. You can't upgrade the RAM. Or anything else for that matter. Why anybody would lock themselves in like that is beyond me.
Re: (Score:2)
was this meant to be funny? it wasn't. maybe try again next time with more humor.
I disagree with you. Does that make me correct?
Re: (Score:2)
I disagree with you. Does that make me correct?
Odds are good.
Re: (Score:3)
I disagree with you. Does that make me correct?
Odds are good.
Thank you, Magic 8 Ball.
Re: (Score:2)
I think that perhaps what you need isn't "more humour"... but "simpler humour". Deadpan is, after all, more reliant on the active participation of the recipient than are other forms of comedy. Delivering deadpan to dumb people falls flat.
Not to say you're dumb... but to imply it.
Re: (Score:2)
It made me laugh. It was funny. Lighten up, you'll live longer.
Re: (Score:2)
Well now you've ruined my day. If I can't run FS2024 on it what's the point......
Re: (Score:2)
If I can't run FS2024 on it what's the point......
To be fair, nothing can run FS2024...
Re:Typical Apple. (Score:4, Interesting)
Put Linux on it. Then you can run zram to increase available memory, and a swap drive over the USB connector. Basically an infinite amount of address space in a tiny little box.
Re: (Score:2)
I looked it up. You can't upgrade the RAM. Or anything else for that matter. Why anybody would lock themselves in like that is beyond me.
I'd ask why a display adapter needs RAM at all... let alone enough processing power to run a 486 game.
Re: (Score:2)
Think of it as being nearly firmware-free. Since the firmware is loaded externally on connection, it always has a compatible version. And it's not just shifting the video through like for like: the devices have different resolutions and aspect ratios.
So what? (Score:2)
These have gotten lame now. Any Turing machine runs Doom. And no, I don't want to see Doom running on a tape-moving machine. How about doing something else for a change? I like the adaptor-hacking part, but running Doom on it is, like, whatever. Don't get me wrong, I loved Doom back in 1993 or whenever, but did we really need to keep rehashing Doom and only Doom? It made a great contribution to gaming and 3D, but how about maybe running a demo of Maze War and then showing some video game evolution all the way up to a
Re: (Score:2)
Doing it with a decent framerate at least demonstrates some decent MIPS and I/O capabilities that aren't addressed in the Turing machine model of computing. That's the problem with a TM: it can't tell you the time of day, because execution time is a side effect that lives outside of that theoretical model.
Re: (Score:2)
decent MIPS and I/O capabilities that aren't addressed in the Turing machine model of computing
Er, you might want to dig out your old notes. Or borrow someone else's.
That's the problem with a TM. It can't tell you the time of day
Don't be so sure [youtube.com]
because execution time is a side effect that lives outside of that theoretical model
Hmm... Maybe skip the notes and dust off your old text book.
Re: (Score:2)
Cute video. Not actually using Turing's description, but that's fine, it's enjoyable at least.
It was extremely useful for generalizing computational theory and algorithms, and for demonstrating the halting problem.
You might want to check your own notes on what an algorithm is.
Re: (Score:2)
You might want to check your own notes on what an algorithm is.
Where do you believe I've gone wrong?
Re: (Score:2)
Re: (Score:2)
I really must have been an impossibly early adopter of the Orange King's cult to have such a low uid.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
The existence of busy beaver machines shows that step counting is part of the Turing model.
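Step counts, at least, are perfectly well-defined inside the model. As a self-contained toy, here's the classic 2-state busy beaver in C; it halts after exactly 6 steps with 4 ones on the tape:

```c
/* The classic 2-state, 2-symbol busy beaver: it halts after exactly
 * 6 steps having written 4 ones, so "number of steps" is a perfectly
 * well-defined quantity inside the Turing machine model. */
#include <stdio.h>

enum { A, B, HALT };

int main(void) {
    /* rule[state][symbol] = {write, move (-1 left / +1 right), next} */
    const int rule[2][2][3] = {
        /* A */ { {1, +1, B}, {1, -1, B} },
        /* B */ { {1, -1, A}, {1, +1, HALT} },
    };
    int tape[64] = {0}, head = 32, state = A, steps = 0;

    while (state != HALT) {
        const int *r = rule[state][tape[head]];
        tape[head] = r[0];
        head += r[1];
        state = r[2];
        steps++;
    }

    int ones = 0;
    for (int i = 0; i < 64; i++) ones += tape[i];
    printf("halted after %d steps with %d ones\n", steps, ones);  /* 6, 4 */
    return 0;
}
```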
Feel free to do something better yourself (Score:1)
Though I suspect you wouldn't know where to start.
Re: (Score:2)
Re:So what? (Score:5, Informative)
The astounding thing about a waltzing bear is not how gracefully it waltzes but that it waltzes at all. - I think this is from Robert A. Heinlein's Time Enough for Love.
It seems Doom on obscure platform [whatever] has become the new waltzing bear. Yeah, lame in terms of utility, but good for geek creds.
Now I'd like to see Doom running as an Excel spreadsheet. Oh, wait. [gamedeveloper.com]
Re: (Score:2)
Pfff. Excel is Turing-complete, so that doesn't impress me. What impresses me is that now, when your wife tells you you're Doomed for the next 18 years, you don't know whether it's because her pregnancy test was positive or because she was actually just playing Doom on it. https://www.youtube.com/watch?... [youtube.com]
Re:So what? (Score:4, Interesting)
These have gotten lame now. Any Turing machine runs doom.
Well, yes, but. I'm continuously astounded by (a) what everyday items have embedded SoCs in them and (b) how creative people get reprogramming those controllers.
I'm also gobsmacked that the easiest way to make the dongle work was to produce a stripped-down iOS. I mean, I get it, you've got the OS source just sitting there in your local repo, and perhaps sharing key-exchange code would be handy. But surely a video dongle could just run some code on the bare silicon. Why does it need an OS at all?
Re: (Score:2)
Why does it need an OS at all?
So the "right" group can slip something in there if needed... but it'll just be used against the bad guys. For real this time!
Re: (Score:3)
Why does it need an OS at all?
So the "right" group can slip something in there if needed... but it'll just be used against the bad guys. For real this time!
That SoC doesn't even have a ROM. The firmware is loaded every time it's plugged in.
And wtf would an iPhone-HDMI dongle be an attack vector for, as if you couldn't just sneak a SoC into a generic charger adapter instead?
Re: (Score:2)
That SoC doesn't even have a ROM. The firmware is loaded every time it's plugged in.
So, where's the bootloader? Does the host RDMA the code to memory and use some weird signal to reset/start the processor?
Re: (Score:1)
>Why does it need an OS at all?
It doesn't, and my cynical opinion is that it's probably because whoever did the design at Apple simply didn't know how to write code for an embedded processor without an operating system.
Think of how many developers design docker images for cloud services that contain bloated operating systems to run a microservice. Or node projects with dozens of dependencies for trivial functions. Or how many "desktop apps" are just bloated 100MB-300MB electron webapps.
If it works acceptably, i
Re: (Score:3)
These have gotten lame now. Any Turing machine runs doom.
Well, yes, but. I'm continuously astounded by (a) what everyday items have embedded SoCs in them and (b) how creative people get reprogramming those controllers.
I'm also gobsmacked that the easiest way to make the dongle work was to produce a stripped-down iOS. I mean, I get it, you've got the OS source just sitting there in your local repo, and perhaps sharing key-exchange code would be handy. But surely a video dongle could just run some code on the bare silicon. Why does it need an OS at all?
It's not an OS by any definition; the firmware is just a kernel that runs a single process, without even a filesystem. The dev firmware apparently has a little filesystem with some debug utilities; that's what this guy used.
https://panic.com/blog/the-lig... [panic.com]
Re: (Score:2)
It's not an OS by any definition; the firmware is just a kernel that runs a single process, without even a filesystem.
Which all sounds more and more reasonable. I just wonder what services the rump OS provides. Memory management? Interrupts? Threading? Device drivers? Encryption/decryption? I find it hard to believe you need any of that.
About the only thing I can think you'd want is initializing the hardware. There's often an annoying amount of gobbledygook you need to do to get a processor running, even on bare metal.
Re: (Score:2)
I think the bigger takeaway is the absurdity of an HDMI cable requiring an embedded SoC.
Lightning limitations (Score:3)
the absurdity of an HDMI cable requiring an embedded SOC
It comes from how shitty Lightning is. There's no way to output an actual display signal over it. No DisplayPort, no HDMI (as opposed to, say, USB-C in DP Alt Mode or an MHL-capable micro-USB).
The only way to get screen output from an iShiny device is to "sort-of-kind-of-AirPlay" the display over to the adapter, where a tiny stripped-down computer runs a video stream player to generate the actual HDMI signal.
Re: (Score:2)
Re: (Score:2)
Imagine a Beowulf cluster of these bad boys!
Re: So what? (Score:2)
Re: (Score:2)
Running Doom is the 'hello world' of graphical apps now
Why does it need a full OS install? (Score:3)
Just to convert from one serial protocol to another? Talk about overkill.
Re:Why does it need a full OS install? (Score:5, Insightful)
Re: (Score:2)
+1 insightful.
Re: (Score:2)
Just to convert from one serial protocol to another? Talk about overkill.
HDMI is not a serial protocol.
Nobody said the SoC is running a "full" OS.
Doom doesn't require a "full" OS, it ran in 16-bit real mode on DOS.
https://panic.com/blog/the-lig... [panic.com]
Re: (Score:2)
Doom most certainly didn't run in 16 bit real mode.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Doom wasn't a 16-bit real-mode program. It very much depended on a DOS extender (it shipped with DOS/4GW) to run in 32-bit protected mode. There is, however, a project that attempted to port Doom to 16-bit real mode, called RealDOOM.
Re: (Score:3)
Just to convert from one serial protocol to another? Talk about overkill.
Because Lightning is too slow to be usable for full-frame video. It is limited to 480 Mbps (USB 2.0 speeds) on all but a small handful of devices.
To work around that problem, Apple encodes MPEG video or similar using the hardware codec in their chipset, then decompresses it on the other side. Basically, it is AirPlay, just with the data sent over USB. This gets the data rate down to something feasible over such a slow link. It also means the adapter basically has to be a small computer.
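The back-of-the-envelope numbers (mine, not Apple's) make it obvious why raw video was never an option:

```c
/* Back-of-the-envelope check: even modest uncompressed video blows
 * well past Lightning's ~480 Mbps (USB 2.0-class) budget. */
#include <stdio.h>

int main(void) {
    const double link_mbps = 480.0;               /* USB 2.0-class budget */
    const int w = 1920, h = 1080, bpp = 24, fps = 30;
    double raw_mbps = (double)w * h * bpp * fps / 1e6;
    printf("raw 1080p30: %.0f Mbps vs link: %.0f Mbps (%.1fx over)\n",
           raw_mbps, link_mbps, raw_mbps / link_mbps);
    return 0;
}
```

That's roughly 1493 Mbps of raw pixels against a 480 Mbps pipe, about 3x over budget before you even account for protocol overhead. Hence the hardware codec.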
Re: (Score:2)
Just to convert from one serial protocol to another? Talk about overkill.
Is it really such a wild idea that they had a bunch of chips sitting in a warehouse they wanted to use up?
That's pretty good, but can it run ... (Score:3)
... errrm ... nevermind.
Old nerds will one day run Doom on everything (Score:4, Funny)
The older I get the more convinced I am that one day I will see someone running Doom on a hearing aid.
Re: (Score:2)
Why run Doom on something you stick in your ear when you can run it on something you can stick in ... https://www.youtube.com/watch?... [youtube.com] (safe for work fellas, I don't think this pregnancy test was used for anything other than running Doom).
Re: (Score:2)
I am waiting for someone to port it to Casio Data Bank 150 calculator watch!
Runs well on a Raspberry Pi Pico for $3 (Score:3)
Not bit-banging, PIO (Score:2)
a board with dual-core 133 MHz CPU (+ PIO helper "cores") {...} but the CPU cores are fast enough that they can bit-bang the video and sound signals:
Well, according to the author, it's not the CPU directly bit-banging; it's one of the PIO state-machine cores driving the display:
https://kilograham.github.io/r... [github.io]
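For anyone curious what handing a signal off to PIO looks like in code, here's a sketch using the pico-sdk. The PIO program itself ("pixel_out") is a hypothetical stand-in for the real DVI program; pioasm would generate the included header from its .pio source:

```c
/* Sketch of offloading pixel output to a PIO state machine with the
 * pico-sdk. "pixel_out" is a hypothetical stand-in for the real DVI
 * program; pioasm generates pixel_out.pio.h from the .pio source. */
#include <stddef.h>
#include "pico/stdlib.h"
#include "hardware/pio.h"
#include "pixel_out.pio.h"   /* hypothetical pioasm-generated header */

static PIO pio;
static uint sm;

void pixel_sm_start(void) {
    pio = pio0;
    sm = pio_claim_unused_sm(pio, true);
    uint offset = pio_add_program(pio, &pixel_out_program);

    pio_gpio_init(pio, 0);                /* hand GPIO 0 over to the PIO */
    /* pioasm generates a default-config helper for each program */
    pio_sm_config c = pixel_out_program_get_default_config(offset);
    sm_config_set_out_pins(&c, 0, 1);     /* serialize data out on GPIO 0 */
    sm_config_set_clkdiv(&c, 1.0f);       /* run at full system clock */
    pio_sm_init(pio, sm, offset, &c);
    pio_sm_set_enabled(pio, sm, true);
}

/* The CPU (or, in the real port, a DMA channel) just keeps the TX FIFO
 * fed; the state machine clocks bits out with cycle-exact timing, which
 * is why it isn't really the CPU "bit-banging" anything. */
void pixel_push_scanline(const uint32_t *words, size_t n) {
    for (size_t i = 0; i < n; i++)
        pio_sm_put_blocking(pio, sm, words[i]);
}
```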
Challenge: Run Doom on Legos (Score:2)
Here's a functioning Turing machine computer built out of Lego:
https://ideas.lego.com/project... [lego.com]
That's a good start. And a fantastically interesting video.