Living on the Edge: New Features for NVIDIA Fleet Command Deliver All-in-One Edge AI Management, Maintenance for Enterprises

NVIDIA Fleet Command — a cloud service for deploying, managing and scaling AI applications at the edge — now includes features that enhance the seamless management of edge AI deployments around the world.

As edge AI deployments scale, organizations can end up with thousands of independent edge locations that IT teams must manage — sometimes in far-flung places like oil rigs, weather stations, distributed retail stores or industrial facilities.

NVIDIA Fleet Command offers a simple, managed platform for container orchestration that makes it easy to provision and deploy AI applications and systems across thousands of distributed environments, all from a single cloud-based console.

But deployment is just the first step in managing AI applications at the edge. Optimizing these applications is a continuous process that involves applying patches, deploying new applications and rebooting edge systems.

To make these workflows seamless in a managed environment, Fleet Command now offers advanced remote management, multi-instance GPU provisioning and additional integrations with tools from industry collaborators.

Advanced Remote Management 

IT administrators now can access systems and applications with sophisticated security features. Remote management on Fleet Command offers access controls and timed sessions, eliminating vulnerabilities that come with traditional VPN connections. Administrators can securely monitor activity and troubleshoot issues at remote edge locations from the comfort of their offices.

Edge environments are extremely dynamic — which means administrators responsible for edge AI deployments need to be highly nimble to keep up with rapid changes and minimize deployment downtime. This makes remote management a critical feature for every edge AI deployment.

Check out a complete walkthrough of the new remote management features and how they can be used to help administrators maintain and optimize even the largest edge deployments.

Multi-Instance GPU Provisioning 

Multi-Instance GPU, or MIG, partitions an NVIDIA GPU into several independent instances. MIG is now available on Fleet Command, letting administrators easily assign applications to each instance from the Fleet Command user interface. By allowing organizations to run multiple AI applications on the same GPU, MIG lets organizations right-size their deployments and get the most out of their edge infrastructure.
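
Outside the Fleet Command console, the same partitioning is visible to software on the system itself. The following is a minimal, hypothetical sketch that uses the nvidia-ml-py (pynvml) bindings to list MIG instances on a MIG-enabled GPU; Fleet Command handles instance assignment from its UI, so this is only meant to illustrate what each instance looks like to an application.

```python
# Hedged sketch: enumerate MIG instances with NVML so a workload could be pinned
# to one of them (for example via CUDA_VISIBLE_DEVICES). Assumes the nvidia-ml-py
# package and a MIG-capable GPU at index 0.
import pynvml

pynvml.nvmlInit()
try:
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    current_mode, _pending = pynvml.nvmlDeviceGetMigMode(gpu)
    if current_mode != pynvml.NVML_DEVICE_MIG_ENABLE:
        print("MIG is not enabled on GPU 0")
    else:
        for i in range(pynvml.nvmlDeviceGetMaxMigDeviceCount(gpu)):
            try:
                mig = pynvml.nvmlDeviceGetMigDeviceHandleByIndex(gpu, i)
            except pynvml.NVMLError:
                continue  # no MIG device in this slot
            uuid = pynvml.nvmlDeviceGetUUID(mig)
            mem = pynvml.nvmlDeviceGetMemoryInfo(mig)
            # A container can be pinned to this slice, e.g. CUDA_VISIBLE_DEVICES=<uuid>.
            print(f"MIG instance {i}: {uuid}, {mem.total // 2**20} MiB")
finally:
    pynvml.nvmlShutdown()
```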

Learn more about how administrators can use MIG in Fleet Command to better optimize edge resources to scale new workloads with ease.

Working Together to Expand AI

New Fleet Command collaborations are also helping enterprises create a seamless workflow, from development to deployment at the edge.

Domino Data Lab provides an enterprise MLOps platform that allows data scientists to collaboratively develop, deploy and monitor AI models at scale using their preferred tools, languages and infrastructure. The Domino platform’s integration with Fleet Command gives data science and IT teams a single system of record and consistent workflow with which to manage models deployed to edge locations.

Milestone Systems, a leading provider of video management systems and NVIDIA Metropolis elite partner, created AI Bridge, an application programming interface gateway that makes it easy to give AI applications access to consolidated video feeds from dozens of camera streams. Now integrated with Fleet Command, Milestone AI Bridge can be easily deployed to any edge location.

IronYun, an NVIDIA Metropolis elite partner and top-tier member of the NVIDIA Partner Network, applies advanced AI, evolved over multiple generations of its Vaidio AI platform, to security, safety and operational applications worldwide. Vaidio is an open platform that works with any IP camera and integrates out of the box with dozens of market-leading video management systems. It can be deployed on premises, in the cloud, at the edge and in hybrid environments, and scales from one to thousands of cameras. Fleet Command makes it easier to deploy Vaidio AI at the edge and simplifies management at scale.

With these new features and expanded collaborations, Fleet Command ensures that the day-to-day process of maintaining, monitoring and optimizing edge deployments is straightforward and painless.

Test Drive Fleet Command

To try these features on Fleet Command, check out NVIDIA LaunchPad for free.

LaunchPad provides immediate, short-term access to a Fleet Command instance, so users can easily deploy and monitor real applications on real servers. Hands-on labs walk users through the entire process, from infrastructure provisioning and optimization to application deployment, for use cases like deploying vision AI at the edge of a network.


CORSAIR Integrates NVIDIA Broadcast’s Audio, Video AI Features in iCUE and Elgato Software This Week ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology accelerates creative workflows. 

Technology company CORSAIR and streaming sensation BigCheeseKIT step In the NVIDIA Studio this week.

A leader in high-performance gear and systems for gamers, content creators and PC enthusiasts, CORSAIR has integrated NVIDIA Broadcast technologies into its hardware and iCUE software. Similar AI enhancements have also been added to Elgato’s audio and video software, Wave Link and Camera Hub.

Powerful Broadcast AI audio and video effects transform content creation stations into home studios.

Creators and gamers with GeForce RTX GPUs can benefit from NVIDIA Broadcast’s AI enhancements to CORSAIR and Elgato microphones and cameras, elevating their live streams, voice chats and video conference calls.

Plus, entertainer Jakeem Johnson, better known by his Twitch name BigCheeseKIT, demonstrates how a GeForce RTX 3080 GPU elevates his thrilling streams with AI-powered benefits.

Advanced AI Audio

The NVIDIA Broadcast integration enables AI-powered noise removal and room echo removal audio features in CORSAIR iCUE and Elgato Wave Link that unlock new levels of clarity and sharpness for an exceptional audio experience.

 

Noise removal in Elgato Wave Link is built as a Virtual Studio Technology (VST) plugin, enabling users to apply the effect per audio channel. The effect is also supported in compatible creative apps such as Adobe Premiere Pro, Audition and Blackmagic DaVinci Resolve.

Running on Tensor Cores on GeForce RTX GPUs, the new features use AI to identify users’ voices, separating them from other ambient sounds. This results in noise cancellation that dramatically improves audio and video call quality. Background noises from fans, chatter, pets and more disappear, leaving the speaker’s voice crystal clear.

Broadcast also cancels room echoes, providing dampened, studio-quality acoustics in a wide range of environments without the need to sound-proof walls or ceilings.

CORSAIR’s integration uses a new version of these effects that can also separate out body sounds. The upgrade enables popular capabilities like muting the friend who forgets to turn on push-to-talk on a video call while they chew their lunch.

AI audio effects are ready to be integrated into nearly the entire lineup of CORSAIR headsets.

These Broadcast features are available on nearly the entire lineup of CORSAIR headsets. Users seeking a premium audio experience should consider headsets like the VOID RGB ELITE WIRELESS with 7.1 surround sound, HS80 RGB WIRELESS with spatial audio or the VIRTUOSO RGB WIRELESS SE.

The Elgato Wave XLR unlocks AI audio effects.

For Elgato creators, noise removal can now be enabled in the Wave Link app. This makes AI-enhanced audio possible for Wave mic users, plus XLR microphones thanks to the Elgato Wave XLR.

Unrestr(AI)ned Video Effects

NVIDIA Broadcast’s video technologies integrated into the Elgato Camera Hub include the virtual background feature.

The ‘background replacement’ AI video feature.

AI-enhanced filters powered by GeForce RTX GPUs offer better edge detection to produce a high-quality visual — much like those produced by a DSLR camera — using just a webcam. Supported effects include background blur and replacing the background with a video or still image, eliminating the need for a green screen.

 

Background blur and background replacement are now available in Elgato Camera Hub. Creators can apply AI video effects with Facecam, or their studio camera using Cam Link 4K.

Set Up for Streaming Success

Accessing these NVIDIA Broadcast technologies is fast and simple.

If an eligible CORSAIR headset or the ST100 headset stand is recognized by iCUE, it will automatically prompt installation of the NVIDIA Broadcast Audio Effects.

Elgato Camera Hub now features a new Effects tab. Once selected, users will be prompted to download and install Broadcast Video Effects. For Elgato Wave Link, creators will first need to install the Broadcast Audio Effects, followed by the new noise removal VST.

After installation, Broadcast options will appear within iCUE, Wave Link and Camera Hub.

Check out the installation instructions and FAQ.

Broadcast features require GeForce RTX GPUs that can be found in the latest NVIDIA Studio laptops and desktops. These purpose-built systems feature vivid color displays, along with blazing-fast memory and storage to boost streams and all creative work.

Pick up an NVIDIA Studio system today to turn streams into dreams.

Stream Like a Boss In the NVIDIA Studio

If there’s one thing BigCheeseKIT encapsulates, it’s energy.

BigCheeseKIT enjoyed early success as a Golden Joystick award nominee, serving as an ambassador for Twitch and Norton Gaming. He said that the highlight of his career, undoubtedly, was joining T-Pain’s exclusive gaming label, Nappy Boy Gaming.

A natural entertainer, BigCheeseKIT dazzles his 60,000+ subscribers with his presence, gaming knowledge and authenticity. Powered by his GeForce RTX 3080 GPU and live-streaming optimizations from NVIDIA Studio, such as better performance in OBS Studio, BigCheeseKIT has the resources and know-how to host professional streams.

“It’s like having my own television channel, and I’m the host or entertainer,” said the artist.

BigCheeseKIT streams exclusively with OBS Studio, benefitting massively from the dedicated GPU-based NVIDIA Studio encoder (NVENC), which enables seamless streaming with maximum performance.

“Using NVENC with my live streams makes my quality 20x better,” said BigCheeseKIT. “I can definitely see the difference.”

“Quality and consistency,” BigCheeseKIT noted. “NVIDIA hasn’t failed me.”

OBS Studio’s advanced GPU-accelerated encoding also unlocks higher video quality for streaming and recorded videos. Once he started using it, BigCheeseKIT’s system was built to broadcast.

For on-demand videos, BigCheeseKIT prefers to edit using VEGAS Pro. MAGIX’s professional video editing software takes advantage of GPU-accelerated video effects while using NVENC for faster encoding. Overall, the artist said that his creative workflow — charged by his GPU — became faster and easier, saving valuable time.

For aspiring streamers, BigCheeseKIT offered these words of wisdom: “Stream like everyone is watching. Be yourself, have fun and don’t let negativity get to you.”

Nappy Boy Gaming’s newest member: BigCheeseKIT.

Head over to BigCheeseKIT’s Twitch channel to subscribe, learn more and check out his videos.

NVIDIA Broadcast and the SDKs behind it — which enable third-party integrations like the ones described above — are part of the NVIDIA Studio tools that include AI-powered software and NVIDIA Studio Drivers.

Follow NVIDIA Studio on Instagram, Twitter and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by signing up for the NVIDIA Studio newsletter.


Meet the Omnivore: Animator Entertains and Explains With NVIDIA Omniverse

Editor’s note: This post is a part of our Meet the Omnivore series, which features individual creators and developers who use NVIDIA Omniverse to accelerate their 3D workflows and create virtual worlds.

Australian animator Marko Matosevic is taking dad jokes and breathing animated life into them with NVIDIA Omniverse, a virtual world simulation and collaboration platform for 3D workflows.

Matosevic’s work is multifaceted: he’s a VR developer by day and a YouTuber by night, a lover of filmmaking and animation, and has a soft spot for sci-fi and dad jokes.

The animated shorts on Matosevic’s YouTube channels Markom3D and Deadset Digital are the culmination of those varied interests and pursuits.

To bring the above film to life, Matosevic harnessed Reallusion iClone and Character Creator for character creation, Perception Neuron V3 for body motion capture, and NVIDIA Omniverse and Epic Games Unreal Engine 5 for rendering.

After noting a lack of YouTube tutorials on how to use animation software like Blender, Matosevic also set himself up as an instructor. He says his goal is to help users at all skill levels — from beginners to advanced — learn new skills and techniques in a concise manner. The following video is a tutorial of the previous film:

Matosevic says his ultimate goal in creating these animated shorts is to make his viewers “have a laugh.”

“Through rough times that we are going through at the moment, it is nice to just let yourself go for a few moments and enjoy a short animation,” he said. “Sure, the jokes may be terrible, and a lot of people groan, but I am a dad, and that is part of my responsibility.”

Optimized Rendering and Seamless Workflow

Initially, Matosevic relied primarily on Blender for 3D modeling and Unreal Engine 4 for his work. It wasn’t until he upgraded to an NVIDIA RTX 3080 Ti GPU that he saw the possibility of integrating the NVIDIA Omniverse platform into his toolbox.

“What really got me interested was that NVIDIA [Omniverse] had its own render engine, which I assumed that my 3080 would be optimized for,” he said. “I was able to create an amazing scene with little effort.”

With NVIDIA Omniverse, Matosevic can export whole scenes from Unreal Engine into Omniverse without having to deal with complex shaders, as he would’ve had to do if he were solely working in Blender.

Along with the iClone and Blender connectors, Matosevic uses NVIDIA Omniverse Machinima, an application that allows content creators to collaborate in real time to animate and manipulate characters along with their environments inside of virtual worlds.

“I like it because with a few simple clicks, I can start the rendering process and know that I am going to have something amazing when it is finished,” he said.

With Universal Scene Description, an open-source 3D scene description for creating virtual worlds, these applications and connectors work seamlessly together to bring elevated results.
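
For the curious, here is a rough, hedged sketch of what authoring USD looks like with Pixar’s open-source Python bindings; the file and prim names are invented for illustration, and in practice Omniverse Connectors write and read this data automatically so each app stays in sync.

```python
# Minimal sketch of authoring a USD scene with the pxr Python API (usd-core).
# Everything here (file name, prim paths, values) is illustrative only.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("example_scene.usda")   # hypothetical file name
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# A placeholder prop; a Blender or iClone export would add characters here.
ball = UsdGeom.Sphere.Define(stage, "/World/Ball")
ball.GetRadiusAttr().Set(0.5)
UsdGeom.XformCommonAPI(ball.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.5, 0.0))

stage.GetRootLayer().Save()  # writes a plain-text .usda any USD-aware app can open
```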

“I have created animated short films using Blender and Unreal Engine 4, but Omniverse has just raised the quality to a new level,” he said.

Join In on the Creation

Creators across the world can download NVIDIA Omniverse for free and Enterprise teams can use the platform for their 3D projects.

Check out works from other artists using NVIDIA Omniverse and submit your own work with #MadeInOmniverse to be featured in the gallery.

Join us at SIGGRAPH 2022 to learn how Omniverse, and design and visualization solutions, are driving advanced breakthroughs in graphics workflows and GPU-accelerated software.

Connect your workflows to NVIDIA Omniverse with software from Adobe, Autodesk, Epic Games, Maxon, Reallusion and more.

Follow NVIDIA Omniverse on Instagram, Twitter, YouTube and Medium for additional resources and inspiration. Check out the Omniverse forums and join our Discord Server to chat with the community.


Action on Repeat: GFN Thursday Brings Loopmancer With RTX ON to the Cloud

Investigate the ultimate truth this GFN Thursday with Loopmancer, now streaming to all members on GeForce NOW. Stuck in a death loop, the game’s detective protagonist searches for the truth, and RTX 3080 and Priority members can join the hunt with RTX ON, including NVIDIA DLSS and ray-traced reflections.

Plus, players can enjoy the latest Genshin Impact event with the “Summer Fantasia” version 2.8 update. It’s all part of the nine new games joining the GeForce NOW library this week.

Enter the Dragon City

The cycle continues until the case is solved. Loopmancer is streaming on GeForce NOW, with RTX ON.

Playing as a detective in this roguelite-platformer action game, members will wake back up in their apartments each time they die, bathed in the neon lights of futuristic Dragon City. As the story progresses, reviewing what seemed like the correct choice in the past may lead to a different conclusion.

Face vicious gangsters, well-equipped mercs, crazy mutants, highly trained bionics and more while searching for clues, even on mobile devices. Unlock new weapons and abilities through endless reincarnations to enhance your fighting skills for fast-paced battles.

RTX 3080 and Priority members can experience Loopmancer with DLSS for improved image quality at higher frame rates, as well as real-time ray-tracing technology that simulates the realistic, physical behavior of light, even on underpowered devices and Macs. Every loop, detail and map – from the richly colored Dragon Town to the gloomy Shuigou Village – is rendered with beautiful cinematic quality.

Ready to initiate a new loop? Try out the Loopmancer demo in the Instant Play Free Demos row before diving into the full game. RTX 3080 and Priority members can even try the demo with RTX ON.

A Summertime Odyssey Awaits

With a cursed blade of unknown origin, a mysterious unsolved case and the familiar — but not too familiar — islands far at sea, the recent addition of Genshin Impact heats up with the version 2.8 “Summer Fantasia” update, now available.

Meet the newest Genshin character, Shikanoin Heizou, a young prodigy detective from the Tenryou Commission with sharp senses. Members can also cool off with the new sea-based “Summertime Odyssey” main event, explore the Golden Apple Archipelago, experience new stories and dress their best with new outfits.

RTX 3080 members can stream all of the fun at 4K resolution and 60 frames per second, or 1440p and 120 FPS from the PC and Mac native apps. They also get the perks of ultra-low latency that rivals console gaming and can catch all of the newest action with the maximized eight-hour play sessions.

Summer Gamin’, Havin’ a Blast

 Fight through dystopian cyberspace and establish an exotic black market store in this rogue-lite, management, shoot ‘em up.

This week brings in a total of nine new titles for gamers to play.

Neon Blight (New release on Steam and Epic Games Store)
Loopmancer (New release on Steam)
Stones Keeper: King Aurelius (New release on Steam, July 14)
Wonder Boy: The Dragon’s Trap (New release free on Epic Games Store, July 14)
Dead Age 2 (Epic Games Store)
Huntdown (Steam)
Out There: Oceans of Time (Epic Games Store)
Titan Quest Anniversary Edition (Epic Games Store)
The Guild 3 (Epic Games Store)

With all of these awesome options to play and only so many hours in a day, we’ve got a question for you. Let us know your answer on Twitter or in the comments below.

You wake up tomorrow to relive today. How much more hours of gaming will you get in?

— NVIDIA GeForce NOW (@NVIDIAGFN) July 13, 2022


Grand Entrance: Human Horizons Unveils Smart GT Built on NVIDIA DRIVE Orin

Tourer vehicles just became a little more grand.

Electric vehicle maker Human Horizons provided a detailed glimpse earlier this month of its latest production model: the GT HiPhi Z. The intelligent EV is poised to redefine the grand tourer vehicle category with innovative, software-defined capabilities that bring luxurious cruising to the next level.

The vehicle’s marquee features include an in-vehicle AI assistant and autonomous driving system powered by NVIDIA DRIVE Orin.

The GT badge first appeared on vehicles in the mid-20th century, combining smooth performance with a roomy interior for longer joy rides. Since then, the segment has diversified, with varied takes on horsepower and body design.

The HiPhi Z further iterates on the vehicle type, emphasizing smart performance and a convenient, comfortable in-cabin experience.

Smooth Sailing

An EV designed to be driven, the GT HiPhi Z also incorporates robust advanced driver assistance features that can give humans a break on longer trips.

The HiPhi Pilot ADAS platform provides dual redundancy for computing, perception, communication, braking, steering and power supply. It uses the high-performance AI compute of NVIDIA DRIVE Orin and 34 sensors to perform assisted driving and parking, as well as smart summon.

DRIVE Orin is designed to handle the large number of applications and deep neural networks running simultaneously for autonomous driving capabilities. It’s architected to achieve systematic safety standards such as ISO 26262 ASIL-D.

With this high level of performance at its core, the HiPhi Pilot system delivers seamless automated features that remove the stress from driving.

Intelligent Interior

Staying true to its GT DNA, the HiPhi Z sports a luxurious interior that delivers effortless comfort for both the driver and passengers.

The cabin includes suede bucket seats, ambient panel lights and a 23-speaker audio system for an immersive sensory environment.

It’s also intelligent, with the HiPhi Bot AI companion that can automatically adjust aspects of the vehicle experience. The AI assistant uses a vehicle-grade, adjustable, high-speed motion robotic arm to interact with passengers. It can move back and forth in less than a second, with control accuracy of up to 0.001 millimeters, performing a variety of delicate movements seamlessly.

The GT HiPhi Z is currently on display in Shenzhen, China, and will tour nearly a dozen other cities. Human Horizons plans to release details of the full launch at the Chengdu Auto Show in August.


Merge Ahead: Researcher Takes Software Bridge to Quantum Computing

Kristel Michielsen was into quantum computing before quantum computing was cool.

The computational physicist simulated quantum computers as part of her Ph.D. work in the Netherlands in the early 1990s.

Today, she manages one of Europe’s largest facilities for quantum computing, the Jülich Unified Infrastructure for Quantum Computing (JUNIQ). Her mission is to help developers pioneer this new realm with tools like NVIDIA Quantum Optimized Device Architecture (QODA).

“This helps bring quantum computing closer to the HPC and AI communities.” -Kristel Michielsen

“We can’t go on with today’s classical computers alone because they consume so much energy, and they can’t solve some problems,” said Michielsen, who leads the quantum program at the Jülich Supercomputing Center near Cologne. “But paired with quantum computers that won’t consume as much energy, I believe there may be the potential to solve some of our most complex problems.”

Enter the QPU

Because quantum processors, or QPUs, harness the properties of quantum mechanics, they’re ideally suited to simulating processes at the atomic level. That could enable fundamental advances in chemistry and materials science, starting domino effects in everything from more efficient batteries to more effective drugs.

QPUs may also help with thorny optimization problems in fields like logistics. For example, airlines face daily challenges figuring out which planes to assign to which routes.

In one experiment, a quantum computer recently installed at Jülich showed the most efficient way to route nearly 500 flights — demonstrating the technology’s potential.

Quantum computing also promises to take AI to the next level. In separate experiments, Jülich researchers used quantum machine learning to simulate how proteins bind to DNA strands and classify satellite images of Lyon, France.

Hybrids Take Best of Both Worlds

Several prototype quantum computers are now available, but none is powerful or dependable enough to tackle commercially relevant jobs yet. But researchers see a way forward.

“For a long time, we’ve had a vision of hybrid systems as the only way to get practical quantum computing — linked to today’s classical HPC systems, quantum computers will give us the best of both worlds,” Michielsen said.

And that’s just what Jülich and other researchers around the world are building today.

Quantum Gets 49x Boost on A100 GPUs

In addition to its current analog quantum system, Jülich plans next year to install a neutral atom quantum computer from Paris-based Pasqal. It’s also been running quantum simulations on classical systems such as its JUWELS Booster, which packs over 3,700 NVIDIA A100 Tensor Core GPUs.

“The GPU version of our universal quantum-computer simulator, called JUQCS, has given us up to 49x speedups compared to jobs running on CPU clusters — this work uses almost all the system’s GPU nodes and relies heavily on its InfiniBand network,” she said, citing a recent paper.

Classical systems like the JUWELS Booster now also use NVIDIA cuQuantum, a software development kit for accelerating quantum jobs on GPUs. “For us, it’s great for cross-platform benchmarking, and for others it could be a great tool to start or optimize their quantum simulation codes,” Michielsen said of the SDK.
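
To give a flavor of what GPU-accelerated quantum simulation involves, here is a deliberately tiny, hedged sketch of a state-vector simulation written with CuPy. It is not the cuQuantum API or JUQCS; cuQuantum’s cuStateVec and cuTensorNet libraries provide optimized primitives for doing this kind of math at the scale of dozens of qubits across many GPUs.

```python
# Toy GPU state-vector simulation (2 qubits) with CuPy, for illustration only.
import cupy as cp

state = cp.zeros(4, dtype=cp.complex64)
state[0] = 1.0  # start in |00>

H = cp.asarray([[1, 1], [1, -1]], dtype=cp.complex64) / (2 ** 0.5)
I = cp.eye(2, dtype=cp.complex64)
CNOT = cp.asarray([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0]], dtype=cp.complex64)

# Hadamard on qubit 0, then CNOT(0 -> 1), producing a Bell state on the GPU.
state = cp.kron(H, I) @ state
state = CNOT @ state

print(cp.abs(state) ** 2)  # ~[0.5, 0, 0, 0.5]
```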

A100 GPUs (green) form the core of the JUWELS Booster that can simulate quantum jobs with the NVIDIA cuQuantum SDK.

Hybrid Systems, Hybrid Software

With multiple HPC and quantum systems on hand and more on the way for Jülich and other research centers, one of the challenges is tying it all together.

“The HPC community needs to look in detail at applications that span everything from climate science and medicine to chemistry and physics to see what parts of the code can run on quantum systems,” she said.

It’s a Herculean task for developers entering the quantum computing era, but help’s on the way.

NVIDIA QODA acts like a software bridge. With a function call, developers can choose to run their quantum jobs on GPUs or quantum processors.

QODA’s high-level language will support every kind of quantum computer, and its compiler will be available as open-source software. And it’s supported by quantum system and software providers including Pasqal, Xanadu, QC Ware and Zapata.
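
QODA has since been productized under the CUDA-Q name, and its exact syntax has evolved, so treat the following as an assumption-laden sketch of the hybrid programming model rather than original QODA code: a quantum kernel is written once, and a single function call retargets it between a GPU-accelerated simulator and a physical QPU backend.

```python
# Hedged sketch using the present-day CUDA-Q Python API (QODA's successor).
import cudaq

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)
    h(qubits[0])
    x.ctrl(qubits[0], qubits[1])
    mz(qubits)

cudaq.set_target("nvidia")        # run on a GPU-accelerated simulator
# cudaq.set_target("quantinuum")  # or retarget the same kernel to a QPU service
counts = cudaq.sample(bell, shots_count=1000)
print(counts)
```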

Quantum Leap for HPC, AI Developers

Michielsen foresees JUNIQ providing QODA to researchers across Europe who use its quantum services.

“This helps bring quantum computing closer to the HPC and AI communities,” she said. “It will speed up how they get things done without them needing to do all the low-level programming, so it makes their life much easier.”

Michielsen expects many researchers will be using QODA to try out hybrid quantum-classical computers — over the coming year and beyond.

“Who knows, maybe one of our users will pioneer a new example of real-world hybrid computing,” she said.

Image at top courtesy of Forschungszentrum Jülich / Ralf-Uwe Limbach


Sequences That Stun: Visual Effects Artist Surfaced Studio Arrives ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology accelerates creative workflows. 

Visual effects savant Surfaced Studio steps In the NVIDIA Studio this week to share his clever film sequences, Fluid Simulation and Destruction, as well as his creative workflows.

 

These sequences feature quirky visual effects that Surfaced Studio is renowned for demonstrating on his YouTube channel.

 

Surfaced Studio’s successful tutorial style, dubbed “edutainment,” features his wonderfully light-hearted personality — providing vital techniques and creative insights which lead to fun, memorable learning for his subscribers.

“I’ve come to accept my nerdiness — how my lame humor litters my tutorials — and I’ve been incredibly fortunate that I ended up finding a community of like-minded people who are happy to learn alongside my own attempts of making art,” Surfaced Studio mused.

The Liquification Situation

To create the Fluid Simulation sequence, Surfaced Studio began in Adobe After Effects by combining two video clips: one of him pretending to be hit by a fluid wave, and another of his friend Jimmy running towards him. Jimmy magically disappears because of the masked layer that Surfaced Studio applied.

Video playback within Blender.

He then rendered the clip and imported it into Blender. This served as a reference to match the 3D scene geometry with the fluid simulation.

The fluid simulation built with Mantaflow collides with Surfaced Studio.

Surfaced Studio then selected the Mantaflow fluid feature, tweaking parameters to create the fluid simulation. For a beginner’s look at fluid simulation techniques, check out his tutorial, FLUID SIMULATIONS in Blender 2.9 with Mantaflow. This feature, accelerated by his GeForce RTX 2070 Laptop GPU, bakes simulations faster than with a CPU alone.
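
For artists who prefer scripting the setup over clicking through the UI, the same Mantaflow configuration can be sketched in Blender’s Python API. The object names, sizes and resolution below are invented for illustration, and the script assumes it is run inside Blender, where the bpy module is available:

```python
# Hedged sketch: a minimal Mantaflow liquid setup via Blender's Python API (bpy).
import bpy

# Domain: a cube that will contain the liquid simulation.
bpy.ops.mesh.primitive_cube_add(size=4, location=(0, 0, 2))
domain = bpy.context.active_object
dmod = domain.modifiers.new(name="Fluid", type='FLUID')
dmod.fluid_type = 'DOMAIN'
dmod.domain_settings.domain_type = 'LIQUID'
dmod.domain_settings.resolution_max = 128  # higher values mean finer detail

# Flow: a small sphere that emits the liquid.
bpy.ops.mesh.primitive_uv_sphere_add(radius=0.3, location=(0, 0, 3))
flow = bpy.context.active_object
fmod = flow.modifiers.new(name="Fluid", type='FLUID')
fmod.fluid_type = 'FLOW'
fmod.flow_settings.flow_type = 'LIQUID'

# Bake the simulation; depending on context this may need to be triggered
# from Blender's UI rather than a background script.
bpy.context.view_layer.objects.active = domain
bpy.ops.fluid.bake_all()
```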

 

To capture a collision with accurate, realistic physics, Surfaced Studio set up rigid body objects, creating the physical geometry for the fluid to collide with. The Jimmy character was marked with the Use Flow property to emit the fluid at the exact moment of the collision.

Speed vectors unlocked the motion blur effect accelerated by Surfaced Studio’s GPU.

“It’s hard not to recommend NVIDIA GPUs for anyone wanting to explore the creative space, and I’ve been using them for well over a decade now,” said Surfaced Studio. 

Surfaced Studio also enabled speed vectors to implement motion blur effects directly on the fluid simulation, adding further realism to the short.

His entire 3D creative workflow in Blender was accelerated by the RTX 2070 Laptop GPU: the fluid simulation, motion blur effects, animations and mesh generation. Blender Cycles RTX-accelerated OptiX ray tracing unlocked quick interactive modeling in the viewport and lightning-fast final renders. Surfaced Studio said his GPU saved him countless hours to reinvest in his creativity.

Take note of the multiple layers needed to bring 3D animations to life.

Surfaced Studio then reached the compositing stage in After Effects, applying the GPU-accelerated Curves effect to the water, shaping and illuminating it to his liking.

He then used the Boris FX Mocha AE plugin to rotoscope Jimmy — or create sequences by tracing over live-action footage frame by frame — to animate the character. This can be a lengthy process, but the GPU-accelerated plugin completed the task in mere moments.

Color touchups were applied with the Hue/Saturation, Brightness and Color Balance features, which are also GPU accelerated.

Finally, Surfaced Studio used the GPU-accelerated NVENC encoder to rapidly export his final video files.

For a deeper dive into Surfaced Studio’s process, watch his tutorial: Add 3D Fluid Simulations to Videos w/ Blender & After Effects.

“A lot of the third-party plugins that I use regularly, including Boris FX Mocha Pro, Continuum, Sapphire, Video Copilot Element 3D and Red Giant, all benefit heavily from GPU acceleration,” the artist said.

His GeForce RTX 2070 Laptop GPU worked overtime with this project — but the Fluid Simulation sequence only scratches the surface(d) of the artist’s skills.

Fire in the Hole!

Surfaced Studio built the short sequence Destruction following a similar creative workflow to Fluid Simulation. 3D scenes in Blender complemented video footage composited in After Effects, with realistic physics applied.

Destruction in Blender for Absolute Beginners covers the basics of how to break objects in Blender, add realistic physics to objects, calculate physics weight for fragments, and animate entire scenes.

3D Destruction Effects in Blender & After Effects offers tips and tricks for further compositing in After Effects, placing 2D stock footage in 3D elements, final color grading and camera-shaking techniques.

“Edutainment” at its finest.

These tools set the foundation for aspiring 3D artists to build their own destructive scenes — and the “edutainment” is highly recommended viewing.

Visual effects artist Surfaced Studio has worked with Intel, GIGABYTE, Boris FX, FX Home (HitFilm) and Gudsen.

Follow NVIDIA Studio on Instagram, Twitter and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the NVIDIA Studio newsletter.


AI on the Sky: Stunning New Images from the James Webb Space Telescope to be Analyzed by, Train, AI

The unveiling by U.S. President Joe Biden Monday of the first full-color image from the James Webb Space Telescope is already astounding — and delighting — humans around the globe.

“We can see possibilities nobody has ever seen before, we can go places nobody has ever gone before,” Biden said during a White House press event. “These images are going to remind the world that America can do big things.”

But humans aren’t the only audience for these images. Data from what Biden described as the “miraculous telescope” are also being soaked up by a new generation of GPU-accelerated AI created at the University of California, Santa Cruz.

And Morpheus, as the team at UC Santa Cruz has dubbed the AI, won’t just be helping humans make sense of what we’re seeing. It will also use images from the $10 billion space telescope to better understand what it’s looking for.

The image released by the National Aeronautics and Space Administration Monday is the deepest and sharpest infrared image of the distant universe to date. Dubbed “Webb’s First Deep Field,” the image of galaxy cluster SMACS 0723 is overflowing with detail.

Answering Questions

NASA reported that thousands of galaxies — including the faintest objects ever observed in the infrared — have appeared in Webb’s view for the first time.

And Monday’s image represents just a tiny piece of what’s out there, with the image covering a patch of sky roughly the size of a grain of sand held at arm’s length by someone on the ground, explained NASA Administrator Bill Nelson.

The telescope’s iconic array of 18 interlocking hexagonal mirrors, which spans a total of 21 feet 4 inches, is peering far deeper into the universe and deeper into the universe’s past than any tool to date.

“We are going to be able to answer questions that we don’t even know what the questions are yet,” Nelson said.

Strange New Worlds

U.S. President Joe Biden unveiled the first image from the $10 billion James Webb Space Telescope Monday. It shows galaxy cluster SMACS 0723 as it appeared 4.6 billion years ago. The combined mass of this galaxy cluster acts as a gravitational lens, magnifying much more distant galaxies behind it.

The telescope won’t just see back further in time than any scientific instrument — almost to the beginning of the universe — it may also help us see if planets outside our solar system are habitable, Nelson said.

Morpheus — which played a key role in helping scientists understand images taken by NASA’s Hubble Space Telescope — will help scientists ask, and answer, these questions.

Working with Ryan Hausen, a Ph.D. student in UC Santa Cruz’s computer science department, UC Santa Cruz Astronomy and Astrophysics Professor Brant Robertson helped create a deep learning framework that classifies astronomical objects, such as galaxies, based on the raw data streaming out of telescopes on a pixel-by-pixel basis.
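
To make “pixel by pixel” concrete, here is a deliberately tiny, hypothetical sketch of the idea behind such a framework. It is not Morpheus’s actual architecture, band count or class list, just an illustration of a fully convolutional network that outputs one class label per pixel of a multi-band telescope image.

```python
# Hedged sketch: per-pixel classification of a multi-band image with PyTorch.
# Band count, class count and layer sizes are assumptions for illustration.
import torch
import torch.nn as nn

N_BANDS = 4    # e.g. several filters stacked as channels (assumed)
N_CLASSES = 5  # illustrative morphological classes, incl. background

model = nn.Sequential(
    nn.Conv2d(N_BANDS, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, N_CLASSES, kernel_size=1),  # per-pixel class scores
)

image = torch.randn(1, N_BANDS, 256, 256)  # a stand-in image cutout
logits = model(image)                      # shape: (1, N_CLASSES, 256, 256)
labels = logits.argmax(dim=1)              # one class label per pixel
print(labels.shape)                        # torch.Size([1, 256, 256])
```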

“The JWST will really enable us to see the universe in a new way that we’ve never seen before,” Robertson said. “So it’s really exciting.”

Eventually, Morpheus will be using the images to learn, too. Not only are the JWST’s optics unique, but JWST will also be collecting light from galaxies that are farther away — and thus redder — than those visible to Hubble.

Morpheus is trained on UC Santa Cruz’s Lux supercomputer. The machine includes 28 GPU nodes with two NVIDIA V100 GPUs each.

In other words, while we’ll all be feasting our eyes on these images for years to come, scientists will be feeding data from the JWST to AI.

Tune in: NASA and its partners will release the full series of Webb’s first full-color images and data, known as spectra, Tuesday, July 12, during a live NASA TV broadcast.

The AI Podcast · Astrophysicist Brant Robertson Using AI to Glean Insights from James Webb Space Telescope – Ep. 171


Windfall: Omniverse Accelerates Turning Wind Power Into Clean Hydrogen Fuel

Engineers are using the NVIDIA Omniverse 3D simulation platform as part of a proof of concept that promises to become a model for putting green energy to work around the world.

Dubbed Gigastack, the pilot project — led by a consortium that includes Phillips 66 and Denmark-based renewable energy company Ørsted — will create low-emission fuel for the energy company’s Humber refinery in England.

Hydrogen is expected to play a critical role as the world moves to reduce its dependence on fossil fuels over the coming years. The market for hydrogen fuel is predicted to grow over 45x to $90 billion by 2030, up from $1.8 billion today.

The Gigastack project aims to showcase how green energy can be woven into complex, industrial energy infrastructure on a massive scale and accelerate net-zero emissions progress.

To make that happen, new kinds of collaboration are vital, explained Ahsan Yousufzai, global head of business development for energy at NVIDIA, during a conversation about the project in an on-demand panel discussion at NVIDIA GTC.

“To meet global sustainability targets, the entire energy ecosystem needs to work together,” Yousufzai said. “For that, technologies like AI and digital twins will play a major role.”

The system — now in the planning stages — will draw power from Ørsted’s massive Hornsea One offshore wind farm, a 1,218-megawatt installation that was the largest in the world upon its completion in January last year.

Hornsea will be connected to ITM Power’s Gigastack electrolyzer facility, which will use electrolysis to turn water into clean, renewable hydrogen fuel.

That fuel, in turn, will be put to work at Phillips 66’s Humber refinery, decarbonizing one of the U.K.’s largest industrial facilities.

The project is unique because of its scale — with plans to eventually ramp up Gigastack into a massive 1-gigawatt electrolyzer system — and because of its potential to become a blueprint for deploying electrolyzer technology for wider decarbonization.

Weaving all these elements together, however, requires tight collaboration between team members from Element Energy, ITM Power, Ørsted, Phillips 66 and Worley.

Worley — one of the largest global providers of engineering and professional services to the oil, gas, mining, power and infrastructure industries — turned to Aspen Technology’s Aspen OptiPlant, sophisticated software that’s a workhorse for planning and optimizing some of the world’s most complex infrastructure.

“When you have a finite amount of money to be spent, you want to maximize the number of options on how facilities can be designed, fabricated and constructed,” explained Vishal Mehta, senior vice president at Worley.

“This is the importance of rapid optioneering, where you’re able to run AI models and engines with not only mathematics but also visual representation,” Mehta said. “People can come up with ideas and, in real time, move them around with mathematical equations changing in the background.”

Worley relied on AspenTech’s OptiPlant to develop a 3D conceptual layout of the Gigastack green hydrogen project. The industrial optimization software combines decades of process modeling expertise with cutting-edge AI and machine learning.

The next step: connecting OptiPlant’s sophisticated physics-based plant piping and layout capabilities to Omniverse to build a 3D conceptual layout of the plant, potentially allowing teams to collaborate on plant design in real time by linking their various 3D software tools, datasets and teams.

“With a traditional model review, it’s one person leading the way, but here we have this opportunity for everybody to be immersed in the facility,” said Sonali Singh, vice president of product management for performance engineering at AspenTech. “They can really all collaborate by looking at their individual priorities.”

Omniverse can be the platform on which they further build the digital twin of the growing facility, connecting simulation data and AI models, capturing knowledge from the human and AI collaborators working on the project, and bringing intelligent optimization.

To learn more, watch the on-demand GTC session and explore the Gigastack project.

Find out how Siemens Gamesa and Zenotech are accelerating offshore wind farm simulations with NVIDIA’s full-stack technologies.


No Fueling Around: Designers Collaborate in Extended Reality on Porsche Electric Race Car

A one-of-a-kind electric race car revved to life before it was manufactured — or even prototyped — thanks to GPU-powered extended reality technology.

At the Automotive Innovation Forum in May, NVIDIA worked with Autodesk VRED to showcase a photorealistic Porsche electric sports car in augmented reality, with multiple attendees collaborating in the same immersive environment.

The demo delivered a life-size digital twin of the Porsche Mission R in AR and VR, which are collectively known as extended reality, or XR. Using NVIDIA CloudXR, Varjo XR-3 headsets and Lenovo Android tablets, audiences saw the virtual Porsche with photorealistic lighting and shadows.

All images courtesy of Autodesk.

Audiences could view the virtual race car side by side with a physical car on site. With this direct comparison, they witnessed the photorealistic nature of the AR model — from the color of the metals, to the surface of the tires, to the environmental lighting.

The stunning demo, which was shown through an Autodesk VRED collaborative session, ran on NVIDIA RTX-based virtual workstations.

There were two ways to view the demo. First, NVIDIA CloudXR streamed the experience to the tablets from a virtualized NVIDIA Project Aurora server, which was powered by NVIDIA A40 GPUs on a Lenovo ThinkSystem SR670 server. Attendees could also use Varjo headsets, which were locally tethered to NVIDIA RTX A6000 GPUs running on a Lenovo ThinkStation P620 workstation.

Powerful XR Technologies Behind the Streams

Up to five users at a time entered the scene, with two users wearing headsets to see the Porsche car in mixed reality, and three users on tablets to view the car in AR. Users were represented as avatars in the session.

With NVIDIA CloudXR, the forum attendees remotely streamed the photorealistic Porsche model. Built on NVIDIA RTX technology, CloudXR extends NVIDIA RTX Virtual Workstation software, which enables users to stream fully accelerated immersive graphics from a virtualized environment.

This demo used a virtualized Lenovo ThinkSystem SR670 server to power NVIDIA’s Project Aurora — a software and hardware platform for XR streaming at the edge. Project Aurora delivers the horsepower of NVIDIA A40 GPUs, so users could experience the rich, real-time graphics of the Porsche model from a machine room over a private 5G network.

Through server-based streaming with Project Aurora, multiple users from different locations were brought together to experience the demo in a single immersive environment. With the help of U.K.-based integrator The Grid Factory, Project Aurora is now available to be deployed in any enterprise.

Learn more about advanced XR streaming with NVIDIA CloudXR.

 
