Avolites media server selected for new lighting installation at Rio Hotel & Casino Las Vegas
An Avolites Ai media server is at the heart of the new lighting installation that wraps both towers at the Rio Hotel & Casino in Las Vegas. It is being used to map, control and schedule over three miles - and 351,032 pixels - of “illuminative possibility”, designed by the creative lighting team of Chris Kuroda and Andrew “Gif” Giffin, using Clear LED’s X-Bar 25 mm product, which wraps 360 degrees around the buildings.
Kuroda and Giffin programmed a series of cues, scenes and sequences that run automatically, bringing an organically engineered lighting aesthetic to the architecture of this iconic Vegas hotel and casino. Ruben Laine of the Australia- and US-based Creative Integration Studio was asked to devise a control solution that treated video as lighting.
This involved outputting lighting in a video-centric format, enabling micro-manageable levels of detail to be accessed for each vertical LED strip, some of them over 4,000 pixels long. The Rio’s lighting scheme is part of an ongoing multi-million-dollar refit to the resort being managed by Dreamscape Companies. The new LEDs replace 3.6 miles of old neon that had been in residence since the 1990s.
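To make the “video as lighting” idea concrete, here is a minimal sketch (a rough illustration, not Creative Integration Studio’s actual code) that treats the whole wrap as one tall video canvas in which every column is a physical LED strip; the strip count, canvas height and helper names are all assumptions:

```python
import numpy as np

# Hypothetical dimensions: roughly 200 vertical strips, the longest just
# over 4,000 pixels (counts taken from the article; exact values assumed).
NUM_STRIPS = 200
MAX_STRIP_LEN = 4096

# The whole wrap is treated as one tall video canvas in which every
# column is a physical LED strip.
canvas = np.zeros((MAX_STRIP_LEN, NUM_STRIPS, 3), dtype=np.uint8)

def set_strip(canvas, strip_index, rgb, intensity=1.0):
    """Drive one vertical strip like a single fixture: a solid colour
    scaled by a 0..1 intensity, just as a console dimmer would."""
    colour = (np.array(rgb, dtype=np.float32) * intensity).astype(np.uint8)
    canvas[:, strip_index] = colour

def set_gradient(canvas, strip_index, rgb_top, rgb_bottom):
    """A per-strip vertical gradient - the kind of micro-manageable
    detail the video-centric format makes addressable."""
    t = np.linspace(0.0, 1.0, canvas.shape[0])[:, None]
    grad = (1 - t) * np.array(rgb_top) + t * np.array(rgb_bottom)
    canvas[:, strip_index] = grad.astype(np.uint8)

# Example: a warm-to-dark gradient on one strip.
set_gradient(canvas, 42, rgb_top=(255, 120, 0), rgb_bottom=(20, 0, 40))
```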
The overall project is the brainchild of Marty Millman, VP of Development and Construction at Dreamscape. He didn’t want new lighting that resembled any other generic or clinically pixel-mapped building installation fed with video content - he wanted something unique, different and standout. A major Phish fan for many years, Millman reached out to the band’s long-term lighting creative team of Kuroda and Giffin, challenging them to produce the specific look he envisioned for The Rio, having been inspired by their lighting designs for the band. Their work for Phish frequently uses linear stage/theatre style light sources - like Robe Tetra2s and TetraXs - as a dynamic structural base to their familiar rig of automated trusses, simultaneously adding another layer of kinetic movement.
Kuroda and Giffin have programmed hundreds of thousands of lighting cues for the assorted Phish tours and projects, using lighting consoles and effects engines, which give the animation a special crispness. This was exactly what Millman wanted, and the workflow is second nature to Kuroda and Giffin, who were delighted to take on the mission but quickly realized that the enormous number of pixels involved meant that DMX driven directly from a lighting console was not an option.
Ruben Laine immediately grasped that they needed “video playback” that did not involve video content. Using the Avolites media server and Ai was one of Laine’s first thoughts, and he specified this product for the task, in combination with the real-time graphics rendering of Notch. Laine, who has used the Avolites Ai media servers for over ten years, collaborated with the Avolites team in the UK to add a new function to the Ai server’s “Follow-on” actions that allows for “randomized specificity” as a custom play mode. This manages all the media, control and scheduling via a Notch block that Laine built, giving lighting control across the entire surface of the buildings.
This custom scheduling enables the playback of a long “base look” followed by a series of random sequences before returning to another base look, repeating the process so that the same series of sequences is never repeated and never becomes predictable. The programmed lighting scenes are divided into two categories - “base looks” that are subtly animated, and “shows” that are faster, bolder, and higher contrast. A “base look” plays for five minutes, followed by a one-minute show - all randomly selected - followed again by another randomly selected base look, then another one-minute show.
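A minimal sketch of how such a randomized base-look/show rotation might work, assuming hypothetical clip pools and a generic play() callback (the real implementation lives in Ai’s Follow-on actions and Laine’s Notch block):

```python
import random
import time

# Hypothetical clip pools matching the timings above: subtle five-minute
# "base looks" alternate with bolder one-minute "shows". Names and the
# play() callback are placeholders, not Ai's real API.
BASE_LOOKS = [f"base_{i:02d}" for i in range(60)]   # ~50-60 slow looks
SHOWS      = [f"show_{i:02d}" for i in range(60)]   # ~50-60 fast looks

def pick(pool, last):
    """Random choice that avoids repeating the previous clip, so the
    rotation never settles into a predictable pattern."""
    clip = random.choice(pool)
    while clip == last and len(pool) > 1:
        clip = random.choice(pool)
    return clip

def run_schedule(play):
    """Endless base-look/show alternation in the spirit of the
    "randomized specificity" follow-on mode described above."""
    last_base = last_show = None
    while True:
        last_base = pick(BASE_LOOKS, last_base)
        play(last_base)
        time.sleep(5 * 60)        # five-minute base look
        last_show = pick(SHOWS, last_show)
        play(last_show)
        time.sleep(60)            # one-minute show

# run_schedule(lambda clip: print("now playing:", clip))
```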
The lighting programming itself was more loosely timed on a clip-by-clip basis, with no two clips the same length, which ruled out scheduling tools like Calendar or Macro Script. They started lighting programming with the linear elements in Notch, treating each vertical line as its own layer or canvas, complete with dedicated intensity controls and a “form” to allow for solids, gradients, or patterns, plus full transform controls like position and scale, as well as different color and alpha controls. This meant that a single layer could handle complex gradients using one element, and these layers were then stacked.
A second, independently controlled layer stack allowed Giffin to get “really funky” with lighting programming by stacking two-dimensional controls: a set of twenty “super layers” covering the entire array, rendering underneath the 200 linear layers with similar but more complex controls and effects. Finally, by including animatable masks, the individual architectural segments and features of the buildings could be highlighted, which maintained the Rio’s architectural identity.
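As a rough illustration of that layer model - not Notch’s actual parameter set - the sketch below encodes the 200 per-strip linear layers and the twenty underlying super layers as simple data structures; all field names are assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LinearLayer:
    """One vertical strip as its own layer/canvas (field names assumed)."""
    strip_index: int
    intensity: float = 1.0                 # dedicated per-strip dimmer
    form: str = "solid"                    # "solid" | "gradient" | "pattern"
    position: float = 0.0                  # transform controls
    scale: float = 1.0
    color: Tuple[int, int, int] = (255, 255, 255)
    alpha: float = 1.0

@dataclass
class SuperLayer:
    """Covers the whole array and renders underneath the linear layers,
    with similar but more complex two-dimensional controls and effects."""
    effect: str = "none"
    intensity: float = 1.0
    mask: str = "full"                     # animatable architectural mask

linear_layers = [LinearLayer(strip_index=i) for i in range(200)]
super_layers = [SuperLayer() for _ in range(20)]
```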
“We wanted to achieve this without the building getting lost in the glamour and glitz of its shiny, new technicolor veil,” says Kuroda, adding that “the genius” of this control methodology “was that it allowed our familiar tools and lighting programming workflow to be used during the creative process.”
Ideas were discussed just as if they were standard lighting cues, created and manipulated on the fly using a lighting console and lighting console logic. They relied on many of their concert lighting tricks, like color wipes across the whole canvas, narrow bands of white leading in a new color from “rocket tips”, or creating shapes with the negative space and animating them into numerous forms. With around fifty or sixty slow-moving looks and another fifty or sixty fast-moving ones, they needed a server that would pick these to play randomly over the course of a year, so that nothing was repeated regularly.
This Notch and Q Series/Ai combination also crunches 2,000 universes of pixel data into eight DMX universes of externally exposed ArtNet channels. Each sequence is played back from the console over ArtNet, recorded into Notch, then rendered at sixty frames per second for the smoothest possible motion across each pixel on The Rio’s facade.
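The rough channel arithmetic behind that crunch (standard DMX figures; the RGB-per-pixel format is an assumption) shows why a console-direct approach was never viable:

```python
# Standard DMX figures; the RGB-per-pixel assumption is ours, but the
# totals line up with the numbers quoted above.
PIXELS = 351_032
CHANNELS_PER_PIXEL = 3                     # one RGB pixel
CHANNELS_PER_UNIVERSE = 512                # one DMX universe

pixel_channels = PIXELS * CHANNELS_PER_PIXEL                  # 1,053,096
pixel_universes = -(-pixel_channels // CHANNELS_PER_UNIVERSE)  # ceiling
print(pixel_universes)                     # 2,057 universes of raw pixels

# What the console actually sees: eight exposed universes of high-level
# layer parameters rather than two thousand universes of raw pixels.
control_channels = 8 * CHANNELS_PER_UNIVERSE
print(control_channels)                    # 4,096 control channels
```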
The Q Series media server outputs the rendered clips to Clear LED’s signal processors, and the data is then pushed down a few miles of fiber optic cable. “Q Series/Ai was without a doubt a crucial part of this adventure - from our original concept of running the show as live Notch blocks, through every creative, technical, and executive challenge, to the final execution,” says Laine. “Using Q Series/Ai allowed us to effectively map the building in just a couple of hours.”
(Photos: Creative Integration Studio/Eliska Sky/Ultra Vegas Drone Services)