Rock The House Media System Build

During the summer of 2018 I was tasked with designing a media system that would go on tour with Rock The House Entertainment Company and subsequently become their flagship entertainment media system. From the get-go we were pretty sure a custom solution would be the answer. Rock The House handles projects of all shapes and sizes, so I knew there wasn't any one platform that would fit every use case; I would be building from scratch.

Going into the project, Rock The House had a list of must-haves for the servers: full redundancy, redundant hard drives, redundant PSUs, and ECC RAM. Two servers would be built as an identical pair so that either could fully stand in for the other; more on how that was achieved later. They also needed the ability to play PowerPoint and Keynote presentations, video, audio, and a multitude of other media formats, plus some form of screen management, including the ability to blend projectors and drive video walls. With these requirements in mind I started evaluating the mainstream platforms, such as Wings, PVP3, d3, and Watchout, to find the best software and hardware approach. My initial focus was on software. I spent time demoing all of the platforms above and also put Resolume to the test from a corporate-show standpoint. After a considerable amount of demoing, I presented the pros and cons of each, and the team decided to proceed with two platforms: Pixera (the successor to Wings) and Resolume. The plan was for Resolume to handle media-busking applications and some corporate gigs, with Pixera picking up most of the corporate work. Knowing that Pixera wouldn't be ready for the tour these servers were being built for, it was time to outfit Resolume as a corporate media server.

With this task in mind, I was ready to build the first prototype: a machine running Resolume on similar hardware that we could use as a proof of concept. It let me show off the major benefits of running a media server versus running multiple laptops into a video switcher, which had been Rock The House's method in the past. Using parts from my personal collection, I set up the prototype and demonstrated all of the features.

After about two days of show and tell, I had the green light to design the custom servers. I dove back into research, studying all the big manufacturers and speaking with industry professionals to figure out the best combination of hardware. After about a week I finally had a parts list. The servers would run on ASUS motherboards with no integrated graphics, 64GB of ECC RAM, Intel Xeon E5 processors, redundant power supplies, two 256GB solid-state drives in RAID 1 for the operating system and software, and two more 1TB SSDs, also in RAID 1, for content. Finally, the most important pieces: NVIDIA Quadro P6000 graphics cards, AJA KONA HDMI capture cards, and AJA KONA SDI capture cards.

The next step was the actual build. This part was fairly easy, since I have been building computers for a long time. Space was a little tight because the graphics cards were huge and I was using 4U rack-mount server chassis, but after some time the builds came together.

Anyone who has ever built a media system knows that assembling the servers is the easy part; it's designing the system around them that's truly challenging. I had to balance efficiency against complexity, on top of all the usual hurdles.

Now let's take a look at the system, starting with a macro overview. On one end was a Roland V800MKII with three PTZ cameras and two manned cameras. The mixer sent two separate outputs into two splitters, which then fed the servers, so each server received identical main and aux feeds from the Roland. This let us use two separate camera feeds in the composition at any time and preserved full redundancy, since both feeds were constantly sent to both servers. Also coming into both servers were three laptops: one for PowerPoint, one for PowerPoint notes, and a third as a backup should either of the other two fail.

On the server level, everything was processed through Resolume, where we added overlays, mapping, and so on, before sending it back out to seven targets: two laser projectors, an ultrawide LED wall, an LED panel arch, two DSM screens, and a previz model screen. The laser projectors served as wing screens flanking the stage; each could be controlled individually to allow separate camera shots and other custom content. Both LED walls ran custom content throughout the show. The two DSM screens were also controlled separately, carrying anything from notes to live views of the content, so the speaker could keep track of what was happening behind them without looking over their shoulder. Finally, the previz monitor gave the operator a 2D representation of the targets, which was helpful because video front of house was often behind the stage, in another room, or hidden behind a curtain.

Reaching all seven targets was a bit of a challenge, which is where the routing and distribution rack came into play. Each server had four 4K outputs. The first fed the operator screen; the other three ran into the routing and distribution rack and into a video matrix. The matrix let us choose which inputs were sent to each of our seven outputs, so we could seamlessly switch between the main and backup servers simply by rerouting, without unplugging and replugging a pile of cables. Of the three 4K feeds from each server, one was routed directly to a 4K LED processor driving the massive arch structure; the remaining two from each server went to Datapath FX4 units, which split each 4K output into four separate 1080p outputs. These eight 1080p outputs powered the wing screens, the DSM screens, the ultrawide wall, and the previz screen, while leaving extra pixel space should we decide to add more outputs for backstage monitors, etc. After the FX4 units, all eight split outputs were sent back into the matrix to be routed to their respective targets. All of this is outlined in the system map provided in the pictures.
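
To make the failover idea concrete, here's a minimal sketch in Python, not the actual matrix control code, of the routing table as plain data. The target names and output numbers are hypothetical; the point is that switching to the backup server just re-points every route, with no cables moved.

```python
# A toy model of the matrix routing table -- not real control code.
# Target names and server output numbers are hypothetical illustrations.
MAIN, BACKUP = "server_a", "server_b"

# Each target is fed by (server, server_output_number) through the matrix.
routes = {
    "led_arch":   (MAIN, 2),   # straight to the 4K LED processor
    "wing_left":  (MAIN, 3),   # split out of a 4K feed by a Datapath FX4
    "wing_right": (MAIN, 3),
    "ultrawide":  (MAIN, 3),
    "dsm_left":   (MAIN, 4),
    "dsm_right":  (MAIN, 4),
    "previz":     (MAIN, 4),
}

def fail_over(routes: dict, to: str = BACKUP) -> dict:
    """Re-point every target at the backup server's identical outputs.

    Because both servers send the same outputs into the matrix, a full
    failover is just this routing swap -- nothing gets unplugged.
    """
    return {target: (to, output) for target, (_, output) in routes.items()}

routes = fail_over(routes)  # switch the whole rig to the backup server
```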

With the physical system worked out, it was time to move to the internals. First up: getting the Roland and Resolume talking nicely and bringing all the HDMI capture sources into Resolume. After some fiddling with settings, I had this running smoothly. Next was getting all the outputs talking happily, sending custom resolutions where needed and making sure everything read the proper refresh rates. After a long battle with EDID and the matrix, the system was finally humming along. The final step was keeping the main and redundant servers in sync. Using an rsync-based sync, I kept the needed file systems aligned between the two servers, so any content change in Resolume would appear on both machines. Next came syncing the live actions in Resolume: using an OSC framework, I replicated every button click from the main machine to the backup machine. Dropping one more level, into the video mapping, I made sure the outputs feeding the FX4 units were mapped as four separate 1080p sections and that everything else in the composition matched the required custom resolutions. By word count this is probably the shortest paragraph, but completing these tasks was by far the most time-consuming and difficult part of the project; it took a lot of tinkering and testing to get everything just right.
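
For anyone curious what those two sync layers can look like, here are two minimal sketches rather than the exact code we ran. The first assumes an rsync binary is available on the servers (on Windows that might mean cwRsync or WSL) and that the backup machine is reachable over SSH; the paths and address are placeholders.

```python
# Mirror the Resolume content drive to the backup server.
# Paths and hostname below are hypothetical placeholders.
import subprocess

SRC = "/content/"                        # local content drive
DEST = "backup@192.168.1.12:/content/"   # backup server over SSH

subprocess.run(
    [
        "rsync",
        "--archive",   # preserve timestamps and permissions
        "--delete",    # keep the backup an exact mirror of the main
        "--partial",   # resume interrupted transfers of large media
        SRC,
        DEST,
    ],
    check=True,
)
```

The second uses the python-osc package and assumes both machines accept OSC on Resolume's default input port, 7000 (check your Resolume OSC preferences); the IPs and relay port are again placeholders. The controller points at the relay, which forwards every message verbatim to both servers, so each button click lands on the main and backup machines alike.

```python
# Relay incoming OSC control messages to both servers at once.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

SERVERS = ["127.0.0.1", "192.168.1.12"]  # main and backup -- placeholders
RESOLUME_OSC_PORT = 7000                 # Resolume's default OSC input port

clients = [SimpleUDPClient(ip, RESOLUME_OSC_PORT) for ip in SERVERS]

def mirror(address, *args):
    # Forward the message unchanged to every server.
    for client in clients:
        client.send_message(address, list(args))

dispatcher = Dispatcher()
dispatcher.set_default_handler(mirror)

# The controller sends OSC to this relay (port 7001 here) instead of
# talking to either Resolume instance directly.
BlockingOSCUDPServer(("0.0.0.0", 7001), dispatcher).serve_forever()
```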

Now for the most useful paragraph for anyone pursuing a similar project: what would I have done differently? First and foremost, don't underestimate the time needed for the testing and configuration phase. Getting all this technology to talk happily, with minimal lag and no dropped frames, takes a lot of work and tinkering; to this day the team and I are still optimizing and adjusting settings to make things better and better. Secondly, thinking in pixel spaces rather than resolutions would have let me make the system a lot more efficient. For example, I could have run all of the LED screens on one processor, cutting down the number of outputs needed from the servers (a quick sketch of this idea appears at the end of this post). Finally, create restore points. When I originally set up the servers, I made the mistake of not creating a fresh restore point that would have let me jump back in time if something went crazy in the settings and really screwed things up. This came back to bite me when others used the servers and adjusted things that later caused problems; I had no easy undo button.

Overall, the project was super educational. I had an amazing time building the system and look forward to doing more things like it in the future. Once the system comes home from tour, my hope is to implement DMX control so the servers can be driven from a lighting console, and to build new configurations in which the servers can function alone. Then, come spring, I plan to bring the Pixera platform into the mix. If you have any questions about the builds, or need advice on a build you're currently working on, feel free to reach out.
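
P.S. For anyone who wants the pixel-space idea made concrete, here's a toy sketch with hypothetical panel sizes. It only checks raw pixel area, which is the first step in deciding whether several LED surfaces can share a single 4K processor feed instead of each consuming its own server output.

```python
# Toy math only -- the surface sizes below are hypothetical, not the
# actual panel counts from this build.
CANVAS = (3840, 2160)                    # one 4K output from a server

surfaces = {                             # name: (width, height) in pixels
    "ultrawide_wall": (3456, 576),
    "arch_left":      (768, 1536),
    "arch_right":     (768, 1536),
}

canvas_px = CANVAS[0] * CANVAS[1]
used_px = sum(w * h for w, h in surfaces.values())

print(f"canvas: {canvas_px:,} px, surfaces: {used_px:,} px "
      f"({used_px / canvas_px:.0%} used)")
# Area is only the first check; the regions still have to be laid out
# without overlap in the processor's pixel map.
```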