
Microsoft and Nvidia Are Making It Easier To Run AI Models on Windows (theverge.com)

Microsoft and Nvidia want to help developers run and configure AI models on their Windows PCs. During the Microsoft Ignite event on Wednesday, Microsoft announced Windows AI Studio: a new hub where developers can access AI models and tweak them to suit their needs. From a report: Windows AI Studio allows developers to access development tools and models from the existing Azure AI Studio and other services like Hugging Face. It also offers an end-to-end "guided workspace setup" with model configuration UI and walkthroughs to fine-tune various small language models (SLMs), such as Microsoft's Phi, Meta's Llama 2, and Mistral.
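
For a rough sense of the kind of workflow that "guided workspace setup" wraps, here is a minimal, hypothetical sketch of fine-tuning a small language model with LoRA adapters via the Hugging Face transformers and peft libraries. This is not Windows AI Studio's actual API; the model ID and adapter settings are illustrative placeholders only.

    # Hypothetical sketch only -- not Windows AI Studio's API.
    # Assumes the torch, transformers, and peft packages; the model ID is a placeholder.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    model_id = "mistralai/Mistral-7B-v0.1"  # any small language model would do here

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision to fit consumer-GPU memory
        device_map="auto",          # place layers on the local GPU when available
    )

    # Attach small LoRA adapters so fine-tuning only updates a few million parameters.
    lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                      target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()

    # A standard Trainer/SFT loop over a local dataset would run the actual fine-tune from here.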

Windows AI Studio lets developers test the performance of their models using Prompt Flow and Gradio templates as well. Microsoft says it's going to roll out Windows AI Studio as a Visual Studio Code extension in the "coming weeks." Nvidia, similarly, revealed updates to TensorRT-LLM, which the company initially launched for Windows as a way to run large language models (LLMs) more efficiently on H100 GPUs. However, this latest update brings TensorRT-LLM to PCs powered by GeForce RTX 30 and 40 Series GPUs with 8GB of RAM or more.
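
The report doesn't show the Gradio templates themselves, but as a hedged illustration of what such a local test harness might look like, the following sketch wires a locally loaded model into a basic Gradio text interface. It assumes the transformers and gradio packages are installed; the model ID is again a placeholder, not anything shipped by Microsoft.

    # Illustrative sketch of a local Gradio test harness -- not the actual
    # Windows AI Studio template. Assumes transformers and gradio are installed.
    import gradio as gr
    from transformers import pipeline

    # Load a placeholder model for local smoke testing.
    generator = pipeline("text-generation", model="mistralai/Mistral-7B-v0.1",
                         device_map="auto")

    def respond(prompt: str) -> str:
        # Return a short completion so the model can be poked at interactively.
        out = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
        return out[0]["generated_text"]

    gr.Interface(fn=respond, inputs="text", outputs="text",
                 title="Local SLM smoke test").launch()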

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Press2ToContinue ( 2424598 ) on Wednesday November 15, 2023 @03:06PM (#64007971)
    Ah, Microsoft and Nvidia teaming up to make AI more accessible on Windows - because what could possibly go wrong? It's like watching two supervillains join forces, except the world they're trying to conquer is just my desktop. I can't wait for the 'AI model failed to load' error to join the pantheon of classic Windows errors. It's like Clippy's revenge: 'It looks like you're trying to run an AI model. Would you like help crashing your system?'

    And let's not forget the nostalgic throwback to when running anything GPU-intensive on Nvidia hardware meant turning your PC into a makeshift room heater. Now with TensorRT-LLM, you can relive those days while pretending to do groundbreaking AI research.

    So, grab your popcorn and watch as your AI dreams clash with the harsh reality of Windows updates and Nvidia drivers. It's like watching a reboot of 'The Matrix,' but the only thing that's dodging bullets is your system stability. And for those of us who remember the days of running simpler AIs like BonziBuddy, this is like stepping into a time machine, only to realize it's powered by Windows ME.

    At least with Windows AI Studio, we can finally answer the age-old question: 'Can an AI write a better BSOD message?' Spoiler alert: Yes, it can, and it will be sarcastically witty.
    • by ls671 ( 1122017 )

      Microsoft and Nvidia Are Making It Easier To Run AI Models on Windows

      I guess it would be nice if Microsoft made *everything* easier to run on Windows, like a web server that respects casing in URLs, etc.

    • I expect the BSOD message to be something along the lines of

      "Well, I donâ(TM)t think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before, and it has always been due to human error."

      • I expect the BSOD message to be something along the lines of

        "Well, I don't think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before, and it has always been due to human error."

        And now that you have posted it online, on its next web-scrape cycle the AI will learn exactly what text to put there, since it's the answer most likely to appear in the context of an AI-generated BSOD message.

      • by rune2 ( 547599 )
        Bonus points if the messages are presented in the style or voice of GLaDOS or Cave Johnson.
  • This is good (Score:3, Insightful)

    by NomDeAlias ( 10449224 ) on Wednesday November 15, 2023 @03:29PM (#64008005)
    Many people like to use their GPU for both gaming and AI models. Dual booting is a pain in the ass, and Linux gaming has always sucked and always will.
    • Re: (Score:2, Informative)

      by sarren1901 ( 5415506 )

      Speak for yourself! I'm loving my Baldur's Gate 3 experience on Linux. Runs wonderfully. I have tons of games I enjoy regularly. Sure, many aren't so mainstream anymore, but they all work perfectly on Linux. Wurm Unlimited, Mount & Blade: Warband (pop3), and Baldur's Gate 3 are the main ones these days. Neverwinter Nights EE works great, and the toolset even works now! Sure, many of these are older games, but most newer games aren't exactly doing anything special or stellar themselves.

      I guess if you want to enj

      • LOL 5 seconds on Google shows that game works best on Windows and people have emulation issues on Linux. Like pretty much every game.
        • You clearly have an axe to grind, or maybe you just hate that Linux can be a perfectly fine gaming platform. Steam has done a lot of work in this area for their Steam Deck, and that same work has gone a long way toward bringing stable gaming to Linux with minimal effort.

          Obviously a game written for a specific platform, Windows in this case, will likely work better or with fewer issues (but not always). I've had a lot of luck in the past several years with Linux gaming, and the fact that I was able to buy Baldur's Gate 3 ne

          • Umm, I've been explicit about the axe I have to grind: it's called gaming on Linux. Stop taking it as some personal affront; it's just a known fact that gaming on Linux has always been a pain in the ass.
  • by MpVpRb ( 1423381 )

    The article kinda says that things will run locally, then sneaks in the word "cloud".
    My conclusion: it's NOT local!

  • [joke]At least AI models might work better than Windows Update...[/joke]

"If you don't want your dog to have bad breath, do what I do: Pour a little Lavoris in the toilet." -- Comedian Jay Leno

Working...