Stella Artois: Give Beautifully

Sean Pruen



UNIT9 brought Stella Artois’ global Christmas campaign ‘Give Beautifully’ to life by designing an interactive light installation.

Made up of thousands of LED lights, our interactive pop-up space gifted the people of New York City and Buenos Aires the magical experience of walking beneath the stars. When guests reached up, they triggered the stars to descend to their position.

At the opening of the event, singer John Legend performed live under the installation. We created bespoke movement and sequencing of the lights for his performance, with several animations that could be changed in real time during the show and an extra layer that made the stars twinkle to the music.

Watch John Legend – Under The Stars

The installation was made up of clusters of interactive and kinetic stars in the centre, which became gradually less dense towards the edges. To capture the scale of the night sky, the canopy of stars stretched 20m x 20m, and we doubled the impact by reflecting it all onto a mirrored floor. This gave people the opportunity to feel like they were walking through an infinite number of stars, based on the concept that Stella Artois was named after the Christmas star.

There were three types of stars in the installation: large kinetic globes, small static globes and a star cloth dotted with hundreds of small LEDs. Five of the kinetic globes were special. When these stars were reached for, the canopy pulsed in a brilliant flash of white, and a DSLR hidden above captured the gaze of the people reaching up from below.

“It was our aim from the beginning to create a light installation that delivered a feeling of wonder. That majestic sense of ‘wow’ you get when you look up at the Milky Way on a cloudless night, far from the noise of the city. A canopy of stars, rich in depth and organic natural beauty, that you truly feel you can reach out and touch.”

Sean Pruen, Creative Director

Technology

The installation was written in C++ using openFrameworks. Our cameras were controlled by Raspberry Pi computers running Python scripts, which sent pictures of the guests to a remote drive. Custom software was built to operate all these aspects simultaneously. Animations could be easily created and tested in the software simulation and then instantly seen on the physical installation.
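The original software isn't public, but the simulate-then-deploy workflow described above can be sketched as an output abstraction: animation code talks only to an interface, and either a simulator or the LED hardware implements it. Everything below (the `StarCanopy` interface, `SimulatedCanopy`, `pulseFrame`) is a hypothetical illustration, not the production code.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical output interface: the same animation code can drive
// either an on-screen simulator or the physical LED controllers.
class StarCanopy {
public:
    virtual ~StarCanopy() = default;
    // Set one star's brightness in the range 0..1.
    virtual void setBrightness(std::size_t star, float value) = 0;
};

// Simulator backend: records the current frame so an animation can be
// previewed and tested without any hardware attached.
class SimulatedCanopy : public StarCanopy {
public:
    explicit SimulatedCanopy(std::size_t count) : frame_(count, 0.f) {}
    void setBrightness(std::size_t star, float value) override {
        frame_[star] = value;
    }
    const std::vector<float>& frame() const { return frame_; }
private:
    std::vector<float> frame_;
};

// One animation step: a simple travelling pulse. The real show logic
// would be far richer; the point is that it only talks to StarCanopy,
// so swapping in a hardware backend changes nothing in the animation.
void pulseFrame(StarCanopy& canopy, std::size_t count, std::size_t head) {
    for (std::size_t i = 0; i < count; ++i)
        canopy.setBrightness(i, i == head ? 1.f : 0.1f);
}
```

With this split, a live operator could switch animations during the show while the same code had already been rehearsed in simulation.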

To detect visitors’ hands in the open space we used a motion sensor called RadarTouch. It creates a plane of infrared rays and can detect the position of any interference on that plane. We mounted two of them at about 2m from the floor to cover the entire room. Once a ‘touch’ was detected, the software would map it to the closest kinetic star and start bringing it down to the guest’s hand.
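The touch-to-star mapping step is a simple nearest-neighbour search on the detection plane. As a hedged sketch (the `KineticStar` record and `closestFreeStar` helper are assumptions, not the production code), it might look like this, skipping stars that are already descending so one guest's touch doesn't steal another's star:

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// A 2D point on the RadarTouch detection plane, in metres.
struct Point { float x, y; };

// Hypothetical record of one kinetic star: its position projected onto
// the detection plane, and whether it is already descending.
struct KineticStar {
    Point planePos;
    bool descending;
};

// Return the index of the closest star not already in motion, or -1 if
// none is free. Comparing squared distances avoids a needless sqrt.
int closestFreeStar(const std::vector<KineticStar>& stars, Point touch) {
    int best = -1;
    float bestDist = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < stars.size(); ++i) {
        if (stars[i].descending) continue;
        float dx = stars[i].planePos.x - touch.x;
        float dy = stars[i].planePos.y - touch.y;
        float d = dx * dx + dy * dy;
        if (d < bestDist) {
            bestDist = d;
            best = static_cast<int>(i);
        }
    }
    return best;
}
```

The selected star would then be marked as descending and handed to the kinetic control loop until it reached the guest's hand.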


Credits

Awards