J.Lo’s It’s My Party tour starts with a bang. The curtain drops, revealing the artist dangling above the stage, covered in Swarovski crystals. She’s perched inside a sparkling hoop that sits underneath a wine glass chandelier. There are hundreds of balloons, dozens of dancers, and a multistory video screen, all awash in various shades of violet and rose. It’s a bombastic spectacle — and every detail was designed by a small team of creatives called Silent House.
Working at Silent House puts Alex Reardon behind some of music’s most visual moments. He is a creative director, designer, and partner at the Los Angeles production group, and the list of acts Silent House has worked with reads like a Billboard chart: Meghan Trainor, The Weeknd, Demi Lovato, Brockhampton, Katy Perry, Nicki Minaj, and scores more.
Reardon says his job is to figure out how to tap into the “nucleus” of an artist and represent it onstage. In more practical terms, that means he (and Silent House) operates as a one-stop shop for an artist’s performance or tour, capable of handling show design, production, creative direction, choreography, finances, logistical solutions, and more. Oftentimes, this is all done in just a few months. “What we do is architecture on speed,” Reardon laughs.
Even something as seemingly simple as the floor on which an artist stands has to be fully considered by Reardon and his team. For Tyler, the Creator’s blowout Grammys performance — which put Tyler in the middle of a tilted set designed to look like a neighborhood street — Silent House had to find a coating that would fulfill a number of creative needs: the set was heavily angled up, so a grippy floor texture was needed as Tyler moved upstage, and the coating had to withstand the flames that would shoot up around him. A stack of metal tiles covered in pebble-like textures meant to test different floor coatings is still sitting in Reardon’s office when I visit after the Grammys.
We are all familiar with the end result of a show: the explosive, over-the-top moments packed with ever-shifting lifts, sets, videos, things that go boom, and splashy lights that amaze. But it can be easy to forget that, no matter how complicated, they all start with “a meeting and a blank piece of paper,” Reardon says. Fresh off producing several performances at this year’s Grammys, along with Khalid’s Free Spirit tour, Reardon and I chat about what it takes to make some of music’s biggest shows come alive.
This interview has been lightly edited for clarity.
How long does it take to get from an idea to a show being performed?
We started working on the Grammys in November, loosely. But in general, it’s between four and six months.
That’s not very long.
No, it’s not. We go from inspiration to the team churning out renders to getting modified by budget to remodeling to re-rendering over and over again until we get to “They like it, yay!” Then we build it.
What we do is architecture on speed. But it’s our baseline speed of operating. So it’s not that scary if you do it all the time. If you took someone out of a normal job and just dropped them in, I think the process would chew them up.
You did Tyler, the Creator’s recent Grammys performance and his IGOR tour. During a recent NAMM panel, you said that designing for Tyler is creatively challenging. Why’s that?
You can put Tyler on a blank stage with the house lights on, and he’d still be the best thing you’ve ever seen. How do you frame a guy like that? How do you create something that doesn’t try and compete with him?
The point is to design something where the only thing you’re staring at is him, and the rest augments it. So the hair idea [for the IGOR tour] visually made sense. It’s just appropriate. Here’s a guy wearing a blond wig, and it’s swishing about a bit. Why don’t we make the stage out of it? It wasn’t even expensive. I’d first gone down the route of using big thick cables, and Sew What? said, “Actually, we’ve got another idea. How about if we paint a base material in a good gray that can take projection well, slice it into 3/4-inch pieces, then throw grommets and ties on top, and you’re done?”
But it was just a playground. It was a vehicle for his performance.
We have a wonderfully collaborative process. Tyler is very specific about color. For the Grammys performance, I sent him Pantones to choose from. He gets that specific. He’s really, really involved in his visuals.
How many artists want to be that involved?
Fifteen or 20 percent. They all have a say. They always give you notes. But Tyler is fastidious. And he’s a proper eccentric, which I like because I get on with eccentrics.
When someone hires Silent House, how does the creative process start?
With ears. I listen. It’s unbelievably important to try and get that nucleus. What is it that the artist is trying to get to? I’m here to find out what is it about your show that you want your audience to walk away with. Don’t come to me with a laundry list and say, “I want the big screens, and I want the pyro” because then we end up designing something that looks like everyone else’s show. But if I listen to your emotions, we can design something that will work beautifully on Instagram, and it’ll go around the world.
So the process starts with an initial meeting. I used to like to submit designs on spec without meeting people. I don’t do that anymore because it increases the opportunity of failure.
We’re designers, not artists. We have to be creative to a specification. Always ask for what they need to give them what they want. And finding out what that is means a sitdown with the person that is in charge of everything.
Is that always the artist?
Yeah, the artist. And some have a creative director they work with for styling their vision. Oftentimes, it can be beneficial to have an intermediary between you and the artist.
Because they have a history with the artist. They already know what the artist likes or doesn’t like, and so they can help very quickly in a collaborative way, like, “Oh shit, I forgot to say they hate blue” or whatever it is. That’s very beneficial.
Above, a typical creative meeting with Reardon and J.Lo discussing tour ideas.
To answer your question, it starts with a meeting and a blank piece of paper. I’ll do some research. I look at their promo videos and listen to some of the music if I can. Sometimes it’s a challenge to design a show for an artist whose music you don’t like very much. [Laughs]
How have you seen stage production change over the years?
Well, when I started, it was a couple of rises, a flat stage, and a few lights. As soon as someone moved the decimal point in the cost of a ticket, production value went up, and technology went up as a result. Now, the bar keeps being pushed up and up and up. And because of social media, everyone is seeing everyone else’s shows.
Are you always designing with Instagram in mind?
Entirely. Many years ago, I remember talking to Marc Brickman, who is still one of the best lighting designers ever and worked with Pink Floyd. He said that back then, he designed for the person with the worst seat in the house. I remembered that because it’s very easy to dismiss the fans. I don’t. I see those people as paying a contribution to my mortgage. Thanks for coming.
From there, we moved to magnification where every single person onstage was filmed and shown on giant screens. That means I’m designing a show for the person with the worst seat in the house but also for the camera.
So when I go to a festival like Lollapalooza, and I see those big side screens on either side of a stage, you’re thinking of lighting for those feeds.
Yes. You have to know what you’re doing when it comes to color temperature, the color of white light, and how the illumination on someone’s face is balanced because every part of the show is being relayed super close up. And if you’re dealing with any musician with an ego — and I believe a few of them have an ego — you have to make sure you toe a very cosmetic line. As an example, anyone over a certain age, you have to light very carefully. Depending on the skin tone, you can balance between around 4,200 and 5,600 Kelvin. It’s a fairly narrow band that would be acceptable to light someone’s face.
What do you mean by “toe a cosmetic line”?
Most modern light sources burn a very blue version of what we see as “white.” They also have an almost imperceptible green cast to them. However, on camera, that green shows up. So, to make sure that our clients don’t end up looking like the cast of The Walking Dead, we have to know what combination of gels to use to achieve the most cosmetic version of white light. We can slide around a little in the color spectrum to create a cool white that is still flattering, and obviously, the closer we get to the warm white of a candle, the more “romantic” the illumination.
Before magnification, was it just “light them up as much as possible”?
Pretty much. Back then, we also couldn’t go that bright. The fixtures we were using weren’t capable of producing the same amount of light.
So, we went from lighting it so the person in the back can see, then for magnification, and now for Instagram, which means assuming that everyone is taking photographs all the time. Every single moment has to be thought out. Even so, I’m always trying to design based on what Marc told me, for the person that can’t see quite right.
What is Kelvin, and why does it matter for skin tone?
Kelvin is used in lighting to measure color temperature or the hue of a light source. This is separate from the brightness of a light, which is measured in lumens. Lower Kelvin values equate to amber-toned hues, while higher Kelvin values are cooler and bluish.
The range Reardon prefers to light artists with falls in what would be considered a neutral range or most akin to daylight. Some lights that are in this range appear neutral but have a green undertone that is amplified on camera. Green tones can give skin an unhealthy-looking pallor, so Reardon uses gels to color-correct and get a more accurate white that is bright and crisp (and flattering).
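The band described above can be expressed as a simple check. Here’s a minimal sketch (not any tool Silent House actually uses) that classifies a fixture’s color temperature against the roughly 4,200–5,600 K range Reardon calls cosmetically acceptable for faces on camera:

```python
def describe_color_temp(kelvin: float) -> str:
    """Classify a light source's color temperature in Kelvin
    against the roughly 4,200-5,600 K "cosmetic" band."""
    if kelvin < 4200:
        return "warm / amber (below the cosmetic band)"
    if kelvin <= 5600:
        return "neutral (inside the cosmetic band)"
    return "cool / bluish (above the cosmetic band)"

print(describe_color_temp(3200))  # tungsten-style warm light
print(describe_color_temp(5000))  # neutral, near daylight
print(describe_color_temp(6500))  # cool, bluish overcast daylight
```

The exact endpoints are Reardon’s rule of thumb, not a hard standard; in practice, the choice within the band depends on the performer’s skin tone and the camera’s white balance.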
What does it mean to design something for Instagram?
You’re thinking about the whole. And you’ve got to make sure there are certain impactful apex moments where things exactly hit.
It’s not so much that the look changes. It’s more that a show is sculpted for moments. An apex moment could be a silhouette or a strobe for chaos or someone on a plinth. It’s not about the paragraphs but the punctuation throughout a show.
I guess I thought the answer would be more over-the-top and involve spectacle.
Yes, there are over-the-top bits, like when all the pyro goes off. But that’s just another punctuation point. A punctuation point could also be someone falling off something into a crash mat. It doesn’t have to be big.
One of my favorite shows you’ve designed was the giant head for Avicii’s Levels tour.
With that show, I had already presented five different designs, and everyone agreed on number five. But about two days later, I remember thinking as a raver of old: “What made house music so specifically appealing to me?” It was so positive. And then I thought, “What is house music?” It’s about the everyman. So what represents everyman? Human form. Organic form. That’s where the head as a stage idea came from.
This was 2012, and projection mapping was still quite new. I got together with a bunch of very, very clever people, and we managed to work out how to make things look as though they were floating in front of the head. We filmed an African American woman for the start of the show for the song that starts with “Oh sometimes, I get a good feeling.” I got quite specific and said, “Find someone who’s got 1970s teeth.” Then we blended out everything else, and all we projected was the mouth onto the lips of the head. So many audience members actually thought it was an animatronic! It was interesting to mess with people’s expectations.
One of the things I love about projection mapping is that you can make textures and materials do in the virtual what they can’t physically do in the real world. We made it look as though the whole head was made of metal. And then we made the metal melt off the face. We made it look as though a mirrored cube was floating out in front of the head and rotating. Then it became a neon Rubik’s Cube and shattered. The spec back in those days was to just fuck with the ravers. How many more strobes? How many things can you do to pop their brains?
Are there any tricks to making the projections look so real?
There’s a product called Screen Goo, which may be the worst name of any product ever. It’s a two-step paint process that’s very goopy and terrible. But that is the best product for painting hard surfaces for projection.
Why doesn’t projection mapping work on just any surface?
Let’s say you decide to have a movie night at home in your garden, and you hang a white sheet and put a projector on it. Anything that comes out of the projector that is black will look gray. Your color resolution is lost. You’ve got an idea of what you’re looking at, but it doesn’t look as good as it would on a TV.
A matte material absorbs light, while a shiny one will reflect the lens itself, so you have to find a balance between matte and glossy. Screen Goo is highly pigmented and has a little silvering in the paint that reflects light back at lots of different angles. It improves color rendition (accuracy).
Since Avicii’s a DJ, how did you deal with not knowing what song would be played next?
We designed some software whereby an iPad wirelessly synced to four Mac minis, which were under the four decks in the booth. We knew there were about 60 tracks he was likely to choose from. As soon as he picked a track to load it to a deck, the track’s visuals would also load. We were taking the clock from each individual track and having that trigger lighting and video content. So, for example, when he changed the playback speed of a track, the strobe lights would sync in real time. And I was watching it thinking, “It fucking worked!” I miss the little fella.
Do you ever design shows where certain lighting and visual components are done on the fly?
Ninety-nine percent are locked to timecode. And the reason for that is because if you are using timecode, you can do more. When I started, I would run a lighting console like a piano, playing every little cue. But now, if you really want to get the most nuance out of what you design, you give it to the clock.
What is timecode, and how is it used for live shows?
Big shows need an efficient way to synchronize all of their moving parts down to the second. Yes, there are humans making sure things run smoothly, but when fireworks, video displays, and dramatic light sweeps all happen at the exact same time, that’s because of preprogrammed timecode.
Timecode is a timing signal assigned to every piece of visual and audio media that will be used during a performance. It’s generally represented by a string of four numbers displayed as 00:00:00:00 (hour:minute:second:frame). With audio, the frame count is replaced by a sample count.
On a very basic level, think about watching a movie. The audio is synced with the video so you have the same experience every time you watch it. Now expand that idea out to a concert where hundreds of lights, LED panels, and live effects must all talk to each other in real time and be fully in sync with the music.
Timecode can be a trigger for lighting cues, displaying video, setting audio levels, and executing more complex operations.
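The hour:minute:second:frame format above maps directly to an absolute frame count, which is what actually lets cues line up. Here’s a minimal sketch of that conversion, assuming a non-drop frame rate of 30 fps (real show-control systems also handle drop-frame rates like 29.97, which this ignores):

```python
FPS = 30  # assumed non-drop frame rate for this sketch

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an hh:mm:ss:ff timecode string to a total frame count."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

def frames_to_timecode(total: int, fps: int = FPS) -> str:
    """Convert a total frame count back to an hh:mm:ss:ff string."""
    frames = total % fps
    seconds = (total // fps) % 60
    minutes = (total // (fps * 60)) % 60
    hours = total // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(timecode_to_frames("00:01:00:00"))  # 1800 frames at 30 fps
print(frames_to_timecode(1800))           # 00:01:00:00
```

A cue programmed at a given timecode fires when the show clock reaches that frame, which is how pyro, video, and lights can all hit the same instant.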
Technologically, what we did on Avicii’s tour was fascinating to me. My background is as a classically trained musician, and my dad’s an architect. So to me, anything to do with light and production has to be very specific. It has to be musical. I see so many designs and so many shows that look very much as though they were designed by a technician that got let loose. When people ask me things like, “How many lights have you got in the rig?” I have no idea. It’s not relevant. It’s the wrong question.
During your panel at NAMM, another Silent House designer said your industry is very wasteful. How so?
Most tours are approximately 80 percent built with rental items that get repurposed, and that’s wonderful, but the other 20 percent are custom pieces that will never have another use and go to landfill.
In the old days, they built huge sets that were stored for a while and then scrapped. But we’re not really generating the amount of landfill waste that we used to, mainly because a lot of sets are built with video panels. If your scenery is digital, then that little panel of video that got hooked into one person’s array gets popped into someone else’s array next week.
So that’s great, and they’re LED so they don’t use that much power. There are also new moving lights that we used on the Jonas Brothers and on Tyler that are LED and they’re phenomenally bright. I think over the next two years, we’re going to have a noticeably decreased carbon footprint.
What we use a lot of now is power and diesel fuel. There are a lot of trucks, a lot of transport, airplanes, tour buses, and hotel rooms.
What’s been the most impactful tech development for how you design shows?
Integration. There used to be various visual disciplines for a show: the screen content, the lights, the set.
Now, most things can do double duty or be all of those things at once. I can light a show just with video screens. We’re making sets out of video screens. We can integrate the movement of the video screens with the movement of the artist. The more this symbiosis emerges into one visual entity, the more flexibility I have to create different looks. So if I can now integrate LED screens as a source of illumination, we’ve now merged lighting and video.
Khalid’s Free Spirit tour by the numbers
1 — handheld microphone
3 — GrandMA consoles to control lighting
4 — network processing units to add DMX ports and expand how many parameters can be controlled by the consoles
291 — light fixtures
5 — connection nodes that convert incoming protocols to a lighting protocol called DMX
3 — timecode analyzers to detect abnormalities like repeated frames and jitter
4 — fog and haze generators
671 — LED tiles
1 — video switcher to change between different video or audio sources
6 — video projectors
4 — LED display controllers
3 — media servers
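Several of the items in the list above revolve around DMX, the control protocol that carries lighting levels from the consoles to the fixtures. As a rough illustration (not the tour’s actual control code), a single DMX universe is just 512 channels, each holding an 8-bit level from 0 to 255:

```python
UNIVERSE_SIZE = 512  # a DMX512 universe carries 512 channels

class DmxUniverse:
    """A minimal sketch of one DMX universe's channel data."""

    def __init__(self) -> None:
        self.channels = bytearray(UNIVERSE_SIZE)  # all channels start at 0

    def set_channel(self, channel: int, value: int) -> None:
        """Set a 1-indexed DMX channel to an 8-bit level (0-255)."""
        if not 1 <= channel <= UNIVERSE_SIZE:
            raise ValueError("DMX channels run from 1 to 512")
        if not 0 <= value <= 255:
            raise ValueError("DMX levels run from 0 to 255")
        self.channels[channel - 1] = value

universe = DmxUniverse()
universe.set_channel(1, 255)  # e.g. a dimmer channel at full
universe.set_channel(2, 51)   # 51/255, about the 20 percent LED level Reardon mentions later
print(universe.channels[0], universe.channels[1])  # 255 51
```

The nodes in the list above exist because one universe tops out at 512 channels; a rig with 291 fixtures, each consuming many channels, needs several universes carried over a network protocol and converted to DMX near the fixtures.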
You can see that very clearly in Khalid’s Free Spirit tour.
That stage was about creating a little box that deploys this very fine wraparound string. When Khalid wandered in, he said it was like an art installation.
There were lots of projectors, but we also had LED screens on top and underneath Khalid. So he was entirely in the set, which was interesting and presented its own set of challenges.
During one rehearsal, we hit a cue, and he covered his face and let out this “Ahhhh!” We had no idea what he was reacting to. When we finished rehearsal, I went up on stage to check things out and thought, “Seems pretty good to me.” Then, suddenly, I get what the trouble is, and I’m yelling “Whoa, stop, stop, stop, stop!” The LED floor had these really bright bursts of white light, which looked nice from 150 feet away but are blinding if you’re standing on top of it. [Laughs] If I crank the power of an LED screen’s output to 100, it’s like the sun. We normally dull them down to approximately 20 percent of their output.
What emerging technology are you most excited about?
A piece of software called Notch. Notch has the ability to generate live visual effects in real time, which I’m still convinced is some black magic because that involves enormously heavy amounts of rendering.
We can apply that to a live camera input from the stage. So graphics can respond based on how someone is moving on stage. With Tyler, the Creator, I used an algorithm to have him control the depth of the effect based on his velocity.
For other things, he wanted to appear as though he was in the rain. Normally, you’d just put another layer on the video of rain falling. But Notch can do real-time detection. So it can respond to a live video feed and have the droplets hit his nose and burst. Or his shoulders. Notch is such a broad brush. You can do so much with it.
I see a lot of interest in this area of how to make shows more dynamic and reactive to the performance.
Yes. And it’s about music, not technology. It is frustrating to see people so wrapped up in technology that doesn’t serve the music. I want the technology to reference the vibrations of a violinist. The vibrations of voices. We vibrate at a molecular level. The vibration of music is what dictates everything.
Are there other technologies right now that excite you like Notch?
AR. The idea of taking a negative, which is that everyone watches shows on their phone, and making it into a positive. If you can’t kill it, use it.
Maybe if you watch on your phone in letterbox, you get one set of AR conditions, and another if you watch vertically. You could use AR to decorate a stage any way you want. Snakes could go around things, trees could grow, glass could melt, things could levitate. There’s all this magic you can do with physics out the window.
And there’s no reason why an artist couldn’t monetize that. It opens up sponsorship branding and merchandise opportunities. Imagine a virtual neon sign appearing for an AR merch stand. You could purchase a T-shirt while in the crowd and have it delivered to your home before the show’s even started.
Where do you see the future of stage production going?
More integration with AR and perhaps even AI. It would be interesting to take some feed from the audience and use it to inform what’s happening onstage.
Photography by Dani Deahl / The Verge