Meet the Omnivore: Animator Entertains and Explains With NVIDIA Omniverse

Editor’s note: This post is a part of our Meet the Omnivore series, which features individual creators and developers who use NVIDIA Omniverse to accelerate their 3D workflows and create virtual worlds.

Australian animator Marko Matosevic is breathing animated life into dad jokes with NVIDIA Omniverse, a virtual world simulation and collaboration platform for 3D workflows.

Matosevic’s work is multifaceted: he’s a VR developer by day and a YouTuber by night, a lover of filmmaking and animation, and has a soft spot for sci-fi and dad jokes.

The animated shorts on Matosevic’s YouTube channels Markom3D and Deadset Digital are the culmination of those varied interests and pursuits.

To bring one such film to life, Matosevic harnessed Reallusion iClone and Character Creator for character creation, Perception Neuron V3 for body motion capture, and NVIDIA Omniverse and Epic Games Unreal Engine 5 for rendering.

After noting a lack of YouTube tutorials on animation software like Blender, Matosevic also set himself up as an instructor. He says his goal is to help users at all skill levels — from beginners to advanced — learn new skills and techniques in a concise manner. He has also published a tutorial covering how the film was made.

Matosevic says his ultimate goal in creating these animated shorts is to make his viewers “have a laugh.”

“Through rough times that we are going through at the moment, it is nice to just let yourself go for a few moments and enjoy a short animation,” he said. “Sure, the jokes may be terrible, and a lot of people groan, but I am a dad, and that is part of my responsibility.”

Optimized Rendering and Seamless Workflow

Initially, Matosevic relied primarily on Blender for 3D modeling and Unreal Engine 4 for his work. It wasn’t until he upgraded to an NVIDIA RTX 3080 Ti GPU that he saw the possibility of integrating the NVIDIA Omniverse platform into his toolbox.

“What really got me interested was that NVIDIA [Omniverse] had its own render engine, which I assumed that my 3080 would be optimized for,” he said. “I was able to create an amazing scene with little effort.”

With NVIDIA Omniverse, Matosevic can export whole scenes from Unreal Engine into Omniverse without having to deal with complex shaders, as he would’ve had to do if he were solely working in Blender.

Along with the iClone and Blender connectors, Matosevic uses NVIDIA Omniverse Machinima, an application that allows content creators to collaborate in real time to animate and manipulate characters along with their environments inside of virtual worlds.

“I like it because with a few simple clicks, I can start the rendering process and know that I am going to have something amazing when it is finished,” he said.

With Universal Scene Description, an open-source 3D scene description for creating virtual worlds, these applications and connectors work seamlessly together to elevate his results.

“I have created animated short films using Blender and Unreal Engine 4, but Omniverse has just raised the quality to a new level,” he said.

Join In on the Creation

Creators across the world can download NVIDIA Omniverse for free, and enterprise teams can use the platform for their 3D projects.

Check out works from other artists using NVIDIA Omniverse and submit your own work with #MadeInOmniverse to be featured in the gallery.

Join us at SIGGRAPH 2022 to learn how Omniverse and other design and visualization solutions are driving breakthroughs in graphics workflows and GPU-accelerated software.

Connect your workflows to NVIDIA Omniverse with software from Adobe, Autodesk, Epic Games, Maxon, Reallusion and more.

Follow NVIDIA Omniverse on Instagram, Twitter, YouTube and Medium for additional resources and inspiration. Check out the Omniverse forums and join our Discord Server to chat with the community.

The post Meet the Omnivore: Animator Entertains and Explains With NVIDIA Omniverse appeared first on NVIDIA Blog.

Action on Repeat: GFN Thursday Brings Loopmancer With RTX ON to the Cloud

Investigate the ultimate truth this GFN Thursday with Loopmancer, now streaming to all members on GeForce NOW. Its detective is stuck in a death loop, and RTX 3080 and Priority members can search for the truth with RTX ON — including NVIDIA DLSS and ray-traced reflections.

Plus, players can enjoy the latest Genshin Impact event with the “Summer Fantasia” version 2.8 update. It’s all part of the nine new games joining the GeForce NOW library this week.

Enter the Dragon City

The cycle continues until the case is solved. Loopmancer is streaming on GeForce NOW, with RTX ON.

Playing as a detective in this roguelite-platformer action game, members will wake back up in their apartments each time they die, bathed in the neon lights of futuristic Dragon City. As the story progresses, reviewing what seemed like the correct choice in the past may lead to a different conclusion.

Face vicious gangsters, well-equipped mercs, crazy mutants, highly trained bionics and more while searching for clues, even on mobile devices. Unlock new weapons and abilities through endless reincarnations to enhance your fighting skills for fast-paced battles.

RTX 3080 and Priority members can experience Loopmancer with DLSS for improved image quality at higher frame rates, as well as real-time ray-tracing technology that simulates the realistic, physical behavior of light, even on underpowered devices and Macs. Every loop, detail and map – from the richly colored Dragon Town to the gloomy Shuigou Village – is rendered with beautiful cinematic quality.

Ready to initiate a new loop? Try out the Loopmancer demo in the Instant Play Free Demos row before diving into the full game. RTX 3080 and Priority members can even try the demo with RTX ON.

A Summertime Odyssey Awaits

With a cursed blade of unknown origin, a mysterious unsolved case and the familiar — but not too familiar — islands far at sea, the recent addition of Genshin Impact heats up with the version 2.8 “Summer Fantasia” update, now available.

Meet the newest Genshin character, Shikanoin Heizou, a young prodigy detective from the Tenryou Commission with sharp senses. Members can also cool off with the new sea-based “Summertime Odyssey” main event, explore the Golden Apple Archipelago, experience new stories and dress their best with new outfits.

RTX 3080 members can stream all of the fun at 4K resolution and 60 frames per second, or 1440p and 120 FPS from the PC and Mac native apps. They also get the perks of ultra-low latency that rivals console gaming and can catch all of the newest action with extended eight-hour play sessions.

Summer Gamin’, Havin’ a Blast

Fight through dystopian cyberspace and establish an exotic black-market store in this roguelite management shoot-’em-up.

This week brings in a total of nine new titles for gamers to play.

Neon Blight (New release on Steam and Epic Games Store)
Loopmancer (New release on Steam)
Stones Keeper: King Aurelius (New release on Steam, July 14)
Wonder Boy: The Dragon’s Trap (New release free on Epic Games Store, July 14)
Dead Age 2 (Epic Games Store)
Huntdown (Steam)
Out There: Oceans of Time (Epic Games Store)
Titan Quest Anniversary Edition (Epic Games Store)
The Guild 3 (Epic Games Store)

With all of these awesome options to play and only so many hours in a day, we’ve got a question for you. Let us know your answer on Twitter or in the comments below.

You wake up tomorrow to relive today. How many more hours of gaming will you get in?

— NVIDIA GeForce NOW (@NVIDIAGFN) July 13, 2022

The post Action on Repeat: GFN Thursday Brings Loopmancer With RTX ON to the Cloud appeared first on NVIDIA Blog.

Grand Entrance: Human Horizons Unveils Smart GT Built on NVIDIA DRIVE Orin

Tourer vehicles just became a little more grand.

Electric vehicle maker Human Horizons provided a detailed glimpse earlier this month of its latest production model: the GT HiPhi Z. The intelligent EV is poised to redefine the grand tourer vehicle category with innovative, software-defined capabilities that bring luxurious cruising to the next level.

The vehicle’s marquee features include an in-vehicle AI assistant and an autonomous driving system powered by NVIDIA DRIVE Orin.

The GT badge first appeared on vehicles in the mid-20th century, combining smooth performance with a roomy interior for longer joy rides. Since then, the segment has diversified, with varied takes on horsepower and body design.

The HiPhi Z further iterates on the vehicle type, emphasizing smart performance and a convenient, comfortable in-cabin experience.

Smooth Sailing

An EV designed to be driven, the GT HiPhi Z also incorporates robust advanced driver assistance features that can give humans a break on longer trips.

The HiPhi Pilot ADAS platform provides dual redundancy for computing, perception, communication, braking, steering and power supply. It uses the high-performance AI compute of NVIDIA DRIVE Orin and 34 sensors to perform assisted driving and parking, as well as smart summon.

DRIVE Orin is designed to handle the large number of applications and deep neural networks that run simultaneously for autonomous driving capabilities. It’s architected to achieve systematic safety standards such as ISO 26262 ASIL-D.

With this high level of performance at its core, the HiPhi Pilot system delivers seamless automated features that remove the stress from driving.

Intelligent Interior

Staying true to its GT DNA, the HiPhi Z sports a luxurious interior that delivers effortless comfort for both the driver and passengers.

The cabin includes suede bucket seats, ambient panel lights and a 23-speaker audio system for an immersive sensory environment.

It’s also intelligent, with the HiPhi Bot AI companion that can automatically adjust aspects of the vehicle experience. The AI assistant uses a vehicle-grade, adjustable, high-speed motion robotic arm to interact with passengers. It can move back and forth in less than a second, with control accuracy of up to 0.001 millimeters, performing a variety of delicate movements seamlessly.

The GT HiPhi Z is currently on display in Shenzhen, China, and will tour nearly a dozen other cities. Human Horizons plans to release details of the full launch at the Chengdu Auto Show in August.

The post Grand Entrance: Human Horizons Unveils Smart GT Built on NVIDIA DRIVE Orin appeared first on NVIDIA Blog.

Merge Ahead: Researcher Takes Software Bridge to Quantum Computing

Kristel Michielsen was into quantum computing before quantum computing was cool.

The computational physicist simulated quantum computers as part of her Ph.D. work in the Netherlands in the early 1990s.

Today, she manages one of Europe’s largest facilities for quantum computing, the Jülich Unified Infrastructure for Quantum Computing (JUNIQ). Her mission is to help developers pioneer this new realm with tools like NVIDIA Quantum Optimized Device Architecture (QODA).

“This helps bring quantum computing closer to the HPC and AI communities.” -Kristel Michielsen

“We can’t go on with today’s classical computers alone because they consume so much energy, and they can’t solve some problems,” said Michielsen, who leads the quantum program at the Jülich Supercomputing Center near Cologne. “But paired with quantum computers that won’t consume as much energy, I believe there may be the potential to solve some of our most complex problems.”

Enter the QPU

Because quantum processors, or QPUs, harness the properties of quantum mechanics, they’re ideally suited to simulating processes at the atomic level. That could enable fundamental advances in chemistry and materials science, starting domino effects in everything from more efficient batteries to more effective drugs.
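One reason QPUs are so hard to emulate classically is that an n-qubit register is described by 2^n complex amplitudes, so memory and compute grow exponentially with every added qubit. The toy statevector simulator below illustrates the idea; it is purely a sketch (not JUQCS or cuQuantum), and the `apply_gate` helper is a hypothetical name introduced here for illustration.

```python
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 single-qubit gate to qubit `target` of an n-qubit statevector."""
    # View the 2**n amplitudes as an n-dimensional array, one axis per qubit.
    state = state.reshape([2] * n_qubits)
    # Contract the gate with the target qubit's axis.
    state = np.tensordot(gate, state, axes=([1], [target]))
    # tensordot leaves the new axis in front; move it back into place.
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in |000>

# Hadamard gate: puts a qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
for q in range(n):
    state = apply_gate(state, H, q, n)

# A uniform superposition: every basis state has probability 1/2**n.
probs = np.abs(state) ** 2
print(probs)  # each entry is 0.125
```

Doubling the register from 30 to 31 qubits doubles the memory needed, which is why simulators like JUQCS spread the statevector across thousands of GPUs.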

QPUs may also help with thorny optimization problems in fields like logistics. For example, airlines face daily challenges figuring out which planes to assign to which routes.

In one experiment, a quantum computer recently installed at Jülich showed the most efficient way to route nearly 500 flights — demonstrating the technology’s potential.

Quantum computing also promises to take AI to the next level. In separate experiments, Jülich researchers used quantum machine learning to simulate how proteins bind to DNA strands and classify satellite images of Lyon, France.

Hybrids Take Best of Both Worlds

Several prototype quantum computers are now available, but none is powerful or dependable enough to tackle commercially relevant jobs yet. But researchers see a way forward.

“For a long time, we’ve had a vision of hybrid systems as the only way to get practical quantum computing — linked to today’s classical HPC systems, quantum computers will give us the best of both worlds,” Michielsen said.

And that’s just what Jülich and other researchers around the world are building today.

Quantum Gets 49x Boost on A100 GPUs

In addition to its current analog quantum system, Jülich plans next year to install a neutral atom quantum computer from Paris-based Pasqal. It’s also been running quantum simulations on classical systems such as its JUWELS Booster, which packs over 3,700 NVIDIA A100 Tensor Core GPUs.

“The GPU version of our universal quantum-computer simulator, called JUQCS, has given us up to 49x speedups compared to jobs running on CPU clusters — this work uses almost all the system’s GPU nodes and relies heavily on its InfiniBand network,” she said, citing a recent paper.

Classical systems like the JUWELS Booster now also use NVIDIA cuQuantum, a software development kit for accelerating quantum jobs on GPUs. “For us, it’s great for cross-platform benchmarking, and for others it could be a great tool to start or optimize their quantum simulation codes,” Michielsen said of the SDK.

A100 GPUs (green) form the core of the JUWELS Booster that can simulate quantum jobs with the NVIDIA cuQuantum SDK.

Hybrid Systems, Hybrid Software

With multiple HPC and quantum systems on hand and more on the way for Jülich and other research centers, one of the challenges is tying it all together.

“The HPC community needs to look in detail at applications that span everything from climate science and medicine to chemistry and physics to see what parts of the code can run on quantum systems,” she said.

It’s a Herculean task for developers entering the quantum computing era, but help’s on the way.

NVIDIA QODA acts like a software bridge. With a function call, developers can choose to run their quantum jobs on GPUs or quantum processors.

QODA’s high-level language will support every kind of quantum computer, and its compiler will be available as open-source software. And it’s supported by quantum system and software providers including Pasqal, Xanadu, QC Ware and Zapata.

Quantum Leap for HPC, AI Developers

Michielsen foresees JUNIQ providing QODA to researchers across Europe who use its quantum services.

“This helps bring quantum computing closer to the HPC and AI communities,” she said. “It will speed up how they get things done without them needing to do all the low-level programming, so it makes their life much easier.”

Michielsen expects many researchers will use QODA to try out hybrid quantum-classical computers over the coming year and beyond.

“Who knows, maybe one of our users will pioneer a new example of real-world hybrid computing,” she said.

Image at top courtesy of Forschungszentrum Jülich / Ralf-Uwe Limbach

The post Merge Ahead: Researcher Takes Software Bridge to Quantum Computing appeared first on NVIDIA Blog.

Sequences That Stun: Visual Effects Artist Surfaced Studio Arrives ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology accelerates creative workflows. 

Visual effects savant Surfaced Studio steps In the NVIDIA Studio this week to share his clever film sequences, Fluid Simulation and Destruction, as well as his creative workflows.

 

These sequences feature quirky visual effects that Surfaced Studio is renowned for demonstrating on his YouTube channel.

 

Surfaced Studio’s successful tutorial style, dubbed “edutainment,” features his wonderfully light-hearted personality — providing vital techniques and creative insights which lead to fun, memorable learning for his subscribers.

“I’ve come to accept my nerdiness — how my lame humor litters my tutorials — and I’ve been incredibly fortunate that I ended up finding a community of like-minded people who are happy to learn alongside my own attempts of making art,” Surfaced Studio mused.

The Liquification Situation

To create the Fluid Simulation sequence, Surfaced Studio began in Adobe After Effects by combining two video clips: one of him pretending to be hit by a fluid wave, and another of his friend Jimmy running towards him. Jimmy magically disappears because of the masked layer that Surfaced Studio applied.

Video playback within Blender.

He then rendered the clip and imported it into Blender. This served as a reference to match the 3D scene geometry with the fluid simulation.

The fluid simulation built with Mantaflow collides with Surfaced Studio.

Surfaced Studio then selected the Mantaflow fluid feature, tweaking parameters to create the fluid simulation. For a beginner’s look at fluid simulation techniques, check out his tutorial, FLUID SIMULATIONS in Blender 2.9 with Mantaflow. This feature, accelerated by his GeForce RTX 2070 Laptop GPU, bakes simulations faster than with a CPU alone.

 

To capture a collision with accurate, realistic physics, Surfaced Studio set up rigid body objects, creating the physical geometry for the fluid to collide with. The Jimmy character was marked with the Use Flow property to emit the fluid at the exact moment of the collision.

Speed vectors unlocked the motion blur effect accelerated by Surfaced Studio’s GPU.

“It’s hard not to recommend NVIDIA GPUs for anyone wanting to explore the creative space, and I’ve been using them for well over a decade now,” said Surfaced Studio. 

Surfaced Studio also enabled speed vectors to implement motion blur effects directly on the fluid simulation, adding further realism to the short.

His entire 3D creative workflow in Blender was accelerated by the RTX 2070 Laptop GPU: the fluid simulation, motion blur effects, animations and mesh generation. Blender Cycles RTX-accelerated OptiX ray tracing unlocked quick interactive modeling in the viewport and lightning-fast final renders. Surfaced Studio said his GPU saved him countless hours to reinvest in his creativity.

Take note of the multiple layers needed to bring 3D animations to life.

Surfaced Studio reached the composite stage in After Effects, applying the GPU-accelerated Curves effect to the water, shaping and illuminating it to his liking.

He then used the Boris FX Mocha AE plugin to rotoscope Jimmy — or create sequences by tracing over live-action footage frame by frame — to animate the character. This can be a lengthy process, but the GPU-accelerated plugin completed the task in mere moments.

Color touchups were applied with the Hue/Saturation, Brightness and Color Balance features, which are also GPU accelerated.

Finally, Surfaced Studio used the GPU-accelerated NVENC encoder to rapidly export his final video files.

For a deeper dive into Surfaced Studio’s process, watch his tutorial: Add 3D Fluid Simulations to Videos w/ Blender & After Effects.

“A lot of the third-party plugins that I use regularly, including Boris FX Mocha Pro, Continuum, Sapphire, Video Copilot Element 3D and Red Giant, all benefit heavily from GPU acceleration,” the artist said.

His GeForce RTX 2070 Laptop GPU worked overtime with this project — but the Fluid Simulation sequence only scratches the surface(d) of the artist’s skills.

Fire in the Hole!

Surfaced Studio built the short sequence Destruction following a similar creative workflow to Fluid Simulation. 3D scenes in Blender complemented video footage composited in After Effects, with realistic physics applied.

Destruction in Blender for Absolute Beginners covers the basics of how to break objects in Blender, add realistic physics to objects, calculate physics weight for fragments, and animate entire scenes.

3D Destruction Effects in Blender & After Effects offers tips and tricks for further compositing in After Effects, placing 2D stock footage in 3D elements, final color grading and camera-shaking techniques.

“Edutainment” at its finest.

These tools set the foundation for aspiring 3D artists to build their own destructive scenes — and the “edutainment” is highly recommended viewing.

Visual effects artist Surfaced Studio has worked with Intel, GIGABYTE, Boris FX, FX Home (HitFilm) and Gudsen.

Follow NVIDIA Studio on Instagram, Twitter and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the NVIDIA Studio newsletter.

The post Sequences That Stun: Visual Effects Artist Surfaced Studio Arrives ‘In the NVIDIA Studio’ appeared first on NVIDIA Blog.

AI on the Sky: Stunning New Images From the James Webb Space Telescope to Be Analyzed by, and Train, AI

The unveiling by U.S. President Joe Biden Monday of the first full-color image from the James Webb Space Telescope is already astounding — and delighting — humans around the globe.

“We can see possibilities nobody has ever seen before, we can go places nobody has ever gone before,” Biden said during a White House press event. “These images are going to remind the world that America can do big things.”

But humans aren’t the only audience for these images. Data from what Biden described as the “miraculous telescope” are also being soaked up by a new generation of GPU-accelerated AI created at the University of California, Santa Cruz.

And Morpheus, as the team at UC Santa Cruz has dubbed the AI, won’t just be helping humans make sense of what we’re seeing. It will also use images from the $10 billion space telescope to better understand what it’s looking for.

The image released by the National Aeronautics and Space Administration Monday is the deepest and sharpest infrared image of the distant universe to date. Dubbed “Webb’s First Deep Field,” the picture of galaxy cluster SMACS 0723 is overflowing with detail.

Answering Questions

NASA reported that thousands of galaxies — including the faintest objects ever observed in the infrared — have appeared in Webb’s view for the first time.

And Monday’s image represents just a tiny piece of what’s out there, with the image covering a patch of sky roughly the size of a grain of sand held at arm’s length by someone on the ground, explained NASA Administrator Bill Nelson.

The telescope’s iconic array of 18 interlocking hexagonal mirrors, which spans a total of 21 feet 4 inches, is peering far deeper into the universe, and further into the universe’s past, than any tool to date.

“We are going to be able to answer questions that we don’t even know what the questions are yet,” Nelson said.

Strange New Worlds

U.S. President Joe Biden unveiled the first image from the $10 billion James Webb Space Telescope Monday. It shows galaxy cluster SMACS 0723 as it appeared 4.6 billion years ago. The combined mass of this galaxy cluster acts as a gravitational lens, magnifying much more distant galaxies behind it.

The telescope won’t just see farther back in time than any prior scientific instrument — almost to the beginning of the universe — it may also help us determine whether planets outside our solar system are habitable, Nelson said.

Morpheus — which played a key role in helping scientists understand images taken on NASA’s Hubble Space Telescope — will help scientists ask, and answer, these questions.

Working with Ryan Hausen, a Ph.D. student in UC Santa Cruz’s computer science department, UC Santa Cruz Astronomy and Astrophysics Professor Brant Robertson helped create a deep learning framework that classifies astronomical objects, such as galaxies, based on the raw data streaming out of telescopes on a pixel-by-pixel basis.
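Classifying raw telescope data on a pixel-by-pixel basis is essentially semantic segmentation: the network emits a score for each class at every pixel, and each pixel is then assigned its highest-probability class (Morpheus uses morphological classes such as spheroid, disk, irregular, point source and background). The sketch below illustrates only that final labeling step, with random scores standing in for a real network’s output; it is not the Morpheus code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Morphological classes similar to those Morpheus predicts.
classes = ["background", "spheroid", "disk", "irregular", "point_source"]

# Stand-in for a segmentation network's raw output: one score
# per class at every pixel of a 64x64 image cutout.
scores = rng.normal(size=(64, 64, len(classes)))

# Softmax over the class axis turns scores into per-pixel probabilities.
shifted = scores - scores.max(axis=-1, keepdims=True)  # for numerical stability
exp = np.exp(shifted)
probs = exp / exp.sum(axis=-1, keepdims=True)

# Each pixel takes its most probable class.
label_map = probs.argmax(axis=-1)  # shape (64, 64); values index `classes`
```

The result is a label map the same size as the input image, so astronomers can see not just that a galaxy is present but which pixels belong to it.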

“The JWST will really enable us to see the universe in a new way that we’ve never seen before,” Robertson said. “So it’s really exciting.”

Eventually, Morpheus will use the new images to learn, too. Not only are the JWST’s optics unique, but it will also collect light from galaxies that are farther away — and thus redder — than those visible to the Hubble.

Morpheus is trained on UC Santa Cruz’s Lux supercomputer. The machine includes 28 GPU nodes with two NVIDIA V100 GPUs each.

In other words, while we’ll all be feasting our eyes on these images for years to come, scientists will be feeding data from the JWST to AI.

Tune in: NASA and its partners will release the full series of Webb’s first full-color images and data, known as spectra, Tuesday, July 12, during a live NASA TV broadcast.

The AI Podcast · Astrophysicist Brant Robertson Using AI to Glean Insights from James Webb Space Telescope – Ep. 171

The post AI on the Sky: Stunning New Images From the James Webb Space Telescope to Be Analyzed by, and Train, AI appeared first on NVIDIA Blog.

Windfall: Omniverse Accelerates Turning Wind Power Into Clean Hydrogen Fuel

Engineers are using the NVIDIA Omniverse 3D simulation platform as part of a proof of concept that promises to become a model for putting green energy to work around the world.

Dubbed Gigastack, the pilot project — led by a consortium that includes Phillips 66 and Denmark-based renewable energy company Ørsted — will create low-emission fuel for the energy company’s Humber refinery in England.

Hydrogen is expected to play a critical role as the world moves to reduce its dependence on fossil fuels over the coming years. The market for hydrogen fuel is predicted to grow over 45x to $90 billion by 2030, up from $1.8 billion today.

The Gigastack project aims to showcase how green energy can be woven into complex, industrial energy infrastructure on a massive scale and accelerate net-zero emissions progress.

To make that happen, new kinds of collaboration are vital, explained Ahsan Yousufzai, global head of business development for energy at NVIDIA, during an on-demand panel discussion about the project at NVIDIA GTC.

“To meet global sustainability targets, the entire energy ecosystem needs to work together,” Yousufzai said. “For that, technologies like AI and digital twins will play a major role.”

The system — now in the planning stages — will draw power from Ørsted’s massive 1,218-megawatt Hornsea One offshore wind farm, the largest in the world upon its completion.

Hornsea will be connected to ITM Power’s Gigastack electrolyzer facility, which will use electrolysis to turn water into clean, renewable hydrogen fuel.

That fuel, in turn, will be put to work at Phillips 66’s Humber refinery, decarbonizing one of the U.K.’s largest industrial facilities.

The project is unique because of its scale — with plans to eventually ramp up Gigastack into a massive 1-gigawatt electrolyzer system — and because of its potential to become a blueprint for deploying electrolyzer technology for wider decarbonization.

Weaving all these elements together, however, requires tight collaboration between team members from Element Energy, ITM Power, Ørsted, Phillips 66 and Worley.

Worley — one of the largest global providers of engineering and professional services to the oil, gas, mining, power and infrastructure industries — turned to Aspen Technology’s Aspen OptiPlant, sophisticated software that’s a workhorse for planning and optimizing some of the world’s most complex infrastructure.

“When you have a finite amount of money to be spent, you want to maximize the number of options on how facilities can be designed, fabricated and constructed,” explained Vishal Mehta, senior vice president at Worley.

“This is the importance of rapid optioneering, where you’re able to run AI models and engines with not only mathematics but also visual representation,” Mehta said. “People can come up with ideas and, in real time, move them around with mathematical equations changing in the background.”

Worley relied on AspenTech’s OptiPlant to develop a 3D conceptual layout of the Gigastack green hydrogen project. The industrial optimization software combines decades of process modeling expertise with cutting-edge AI and machine learning.

The next step: connecting OptiPlant’s sophisticated physics-based plant piping and layout capabilities with Omniverse to build a 3D conceptual layout of the plant, potentially allowing teams to collaborate on plant design in real time by linking their various 3D software tools, datasets and teams.

“With a traditional model review, it’s one person leading the way, but here we have this opportunity for everybody to be immersed in the facility,” said Sonali Singh, vice president of product management for performance engineering at AspenTech. “They can really all collaborate by looking at their individual priorities.”

Omniverse can be the platform on which they further build their digital twin of the growing facility, connecting simulation data and AIs, capturing knowledge from the human and AI collaborators working on the project, and enabling intelligent optimization.

To learn more, watch the on-demand GTC session and explore the Gigastack project.

Find out how Siemens Gamesa and Zenotech are accelerating offshore wind farm simulations with NVIDIA’s full-stack technologies.

The post Windfall: Omniverse Accelerates Turning Wind Power Into Clean Hydrogen Fuel appeared first on NVIDIA Blog.

No Fueling Around: Designers Collaborate in Extended Reality on Porsche Electric Race Car

A one-of-a-kind electric race car revved to life before it was manufactured — or even prototyped — thanks to GPU-powered extended reality technology.

At the Automotive Innovation Forum in May, NVIDIA worked with Autodesk VRED to showcase a photorealistic Porsche electric sports car in augmented reality, with multiple attendees collaborating in the same immersive environment.

The demo delivered a life-size digital twin of the Porsche Mission R in AR and VR, which are collectively known as extended reality, or XR. Using NVIDIA CloudXR, Varjo XR-3 headsets and Lenovo Android tablets, audiences saw the virtual Porsche with photorealistic lighting and shadows.

All images courtesy of Autodesk.

Audiences could view the virtual race car side by side with a physical car on site. With this direct comparison, they witnessed the photorealistic nature of the AR model — from the color of the metals, to the surface of the tires, to the environmental lighting.

The stunning demo, which was shown through an Autodesk VRED collaborative session, ran on NVIDIA RTX-based virtual workstations.

There were two ways to view the demo. First, NVIDIA CloudXR streamed the experience to the tablets from a virtualized NVIDIA Project Aurora server, which was powered by NVIDIA A40 GPUs on a Lenovo ThinkStation SR670 Server. Attendees could also use Varjo headsets, which were locally tethered to NVIDIA RTX A6000 GPUs running on a Lenovo ThinkStation P620 workstation.

Powerful XR Technologies Behind the Streams

Up to five users at a time entered the scene, with two users wearing headsets to see the Porsche car in mixed reality, and three users on tablets to view the car in AR. Users were represented as avatars in the session.

With NVIDIA CloudXR, the forum attendees remotely streamed the photorealistic Porsche model. Built on NVIDIA RTX technology, CloudXR extends NVIDIA RTX Virtual Workstation software, which enables users to stream fully accelerated immersive graphics from a virtualized environment.

This demo used a virtualized Lenovo ThinkStation SR670 server to power NVIDIA’s Project Aurora — a software and hardware platform for XR streaming at the edge. Project Aurora delivers the horsepower of NVIDIA A40 GPUs, so users could experience the rich, real-time graphics of the Porsche model from a machine room over a private 5G network.

Through server-based streaming with Project Aurora, multiple users from different locations were brought together to experience the demo in a single immersive environment. With the help of U.K.-based integrator The Grid Factory, Project Aurora is now available to be deployed in any enterprise.

Learn more about advanced XR streaming with NVIDIA CloudXR.


The post No Fueling Around: Designers Collaborate in Extended Reality on Porsche Electric Race Car appeared first on NVIDIA Blog.

Mission-Driven: Takeaways From Our Corporate Responsibility Report

NVIDIA’s latest corporate responsibility report shares our efforts to empower employees and put our technologies to work for the benefit of humanity.

Amid ongoing global economic concerns and pandemic challenges, this year’s report highlights our ability to attract and retain talent who come here to do their life’s work while tackling some of the world’s greatest technology and societal challenges.

Taking Care of Our People 

NVIDIA earned top marks as a workplace, ranking No. 1 on Glassdoor’s Best Places to Work list for large U.S. companies. Some 95% of employees indicated they’d recommend NVIDIA to a friend.

We make the health of our employees and their families a top priority. Our family leave policy allows U.S. employees 12 weeks of fully paid leave to care for family members. And we’ve selected eight days each year in which we shut down all but essential operations globally, so employees can unwind without having to return to a full inbox.

We’ve recently added surrogacy benefits and fertility education resources to our award-winning list of family-forming benefits, which include adoption support and a generous parental leave program of up to 22 weeks of fully paid leave.

And we worked with our LGBTQ+ colleagues to expand gender affirmation resources and support.

Supporting Communities

Last year we established the Ignite program to prepare students from underrepresented communities for NVIDIA summer internships. Sixty-five percent of these students are returning for our internship program, and we saw a 100% increase in applications for this summer’s Ignite program.

We supported professional organizations, including Black Women in AI, Women in Data and Women-ai, to increase access to AI education and technology.

We launched NVIDIA Emerging Chapters, a new program that enables developers in emerging regions to build and scale their AI, data science and graphics expertise through technology access, educational resources and co-marketing opportunities.

We announced a three-year partnership with the Boys & Girls Clubs of Western Pennsylvania to expand access to AI and robotics to students in communities traditionally underrepresented in tech. Core to this is an open-source curriculum that will make it easy for Boys & Girls Clubs nationwide to deliver AI education to their students.

Our employees remained committed to donating resources to those in need, with nearly 40% of them participating in the NVIDIA Foundation’s Inspire 365 efforts during fiscal year 2022. That brought the unique participation rate since the initiative’s start to 68%.

Despite in-person volunteering remaining paused due to COVID, NVIDIANs still logged more than 16,500 volunteer hours through individual and virtual efforts, up more than 76% from the previous fiscal year.

NVIDIANs also joined the company in contributing more than $22 million to charitable causes in the last fiscal year. And during the Ukraine crisis, employees and NVIDIA have donated more than $4.6 million to date for humanitarian relief.

Developing Climate Solutions 

NVIDIA GPUs are enabling progress in responding to the climate crisis. With recent advances in AI, weather forecasting models can now run four to five orders of magnitude faster than traditional computing methods allow.

We plan to build Earth-2, an AI supercomputer that will create a digital twin of the Earth, enabling scientists to do ultra-high-resolution climate modeling and put tools into the hands of cities and nations to simulate the impact of mitigation and adaptation strategies.

Digital twins are also being used to predict costly maintenance at power plants and to model new energy sources, such as fusion reactor designs.

NVIDIA scientists, along with leading institutions, are using AI to model the most efficient ways to capture greenhouse gases from the atmosphere and lock them away underground.

Startups from the NVIDIA Inception program are jumping into the climate challenge as well. In Kenya, a company is using AI to monitor the health of bee colonies. And a German startup is monitoring the ocean floor to help scientists understand how natural carbon sinks can be better utilized.

Building Energy-Efficient Technologies 

These solutions are not only bringing innovation to the climate challenge, but are built on a foundation of energy-efficient technology.

We aim to make every new generation of our GPUs faster and more energy efficient than its predecessor. As AI models and HPC applications increase exponentially in size, moving to new-generation GPUs will help our customers complete their work with lower energy consumption and get results more quickly.

NVIDIA GPUs are typically 20x more energy efficient for AI and HPC workloads than CPUs. If we switched all the CPU-only servers running AI and HPC worldwide to GPU-accelerated systems, the world could save nearly 12 trillion watt-hours of energy a year, equivalent to the electricity requirements of nearly 1.7 million U.S. homes.

Leaning Into Trustworthy AI

We’re committed to the advancement of trustworthy AI, recognizing that technology can have a profound impact on people and the world. We’ve set priorities that are rooted in fostering positive change and enabling trust and transparency in AI development.

We’re developing practices and methodologies that enable the construction of AI products that are trustworthy by design, spanning datasets, machine learning tools and processes, AI model development, and software development and testing.

Running a Mission-Driven Company

As NVIDIA CEO Jensen Huang notes in the opening letter of our corporate responsibility report, creating a place where people can do impactful work means building a culture willing to take on the most pressing problems.

The impacts of accelerated computing, which we have driven over the last two decades, are already being felt in areas as wide ranging as self-driving cars, healthcare and, increasingly, in climate change. We’re proud to have built this organization with more than 20,000 of the brightest minds and look forward to what they choose to tackle next.

GFN Thursday Brings New Games to GeForce NOW for the Perfect Summer Playlist

Nothing beats the summer heat like GFN Thursday. Get ready for four new titles streaming at GeForce quality across nearly any device.

Buckle up for some great gaming, whether poolside, in the car for a long road trip, or in the air-conditioned comfort of home.

Speaking of summer, it’s also last call for this year’s Steam Summer Sale. Check out the special row in the GeForce NOW app for some great gaming deals before the sale ends today at 10 a.m. PDT.

Choose Your Adventure

With more than 1,300 games in the GeForce NOW library, there’s something for everyone. Single-player adventures? Check. Multiplayer battles? Got that, too. GFN Thursday brings more games each week, and it’s nearly impossible to play them all.

Catch up on titles you’ve been eyeing and put together a gaming playlist that fits the perfect summer mood. From blockbuster free-to-play action role-playing games like Genshin Impact and Lost Ark to story-driven sagas like Life is Strange: True Colors, high-speed action in NASCAR 21: Ignition and more, there are plenty of options to keep gamers busy.

There’s something for everyone on GeForce NOW.

Find your next adventure in the native GeForce NOW app or at play.geforcenow.com. Search for a game or genre using the top bar to build out the perfect gaming library. Streaming from GeForce-powered servers keeps the action going, even on Macs, mobile devices, Chromebooks and more.

Even better: RTX 3080 members can play at up to 4K resolution and 60 frames per second on PC and Mac, or take the action to the living room on the recently updated SHIELD TV. They can also take on opponents with ultra-low latency, and turn RTX ON in supported titles for the most cinematic visuals.

Press Play

Stand with the squad on the front lines in “Arma Reforger.”

Not sure where to start? Check out this week’s new additions to squad up in Arma Reforger, bring home the trophy in Matchpoint – Tennis Championships and more.

Here’s what’s coming to GeForce NOW this week:

Matchpoint – Tennis Championships (New release on Steam, July 7)
Starship Troopers – Terran Command (New release on Epic Games Store, July 7)
Sword and Fairy Inn 2 (New release on Steam, July 8)
Arma Reforger (Steam)

rFactor 2 was previously announced as coming to GeForce NOW. At this time, the title will not be coming to the service.

Finally, speaking of your summer playlist, we have a question that may get you a bit nostalgic. Let us know your answer on Twitter or in the comments below.

If you could replay any game as if it were the first time, which game would it be?

— NVIDIA GeForce NOW (@NVIDIAGFN) July 6, 2022