BSkyB has opened what is believed to be the most environmentally friendly broadcast centre in Europe at its West London headquarters. The new building cost £250 million, including infrastructure and technology, and ushers in a new way of working for Sky staff, writes David Fox.
The Harlequin One building was designed for people before technology, the reverse of how broadcast studios are normally planned. “We put people on the outside and we put technology on the inside. We also put it in areas that actually give greater access to those people,” says Darren Long (pictured), director of operations at Sky Sports.
Staff still had to change the way they worked, but their needs were prioritised, with the intention of delivering a building that was very environmentally friendly, highly functional and an excellent place to work.
“It is the most environmentally sustainable broadcast facility in Europe. As a building we were hoping we’d get a Class C BREEAM Rating, but we’ve actually ended up getting an A,” says Alistair Watters, Harlequin One programme director.
Environmental considerations weren’t just a token gesture. “It was from top down,” says Long. At the time the building was commissioned, James Murdoch, then Sky CEO, said: “We’ve got to make a building that really is environmental. I want to know where every bit of that building has come from. I want to know why we put it in there. I want to know what the rationale behind it was. If somebody asks you ‘where did that piece of glass come from or that piece of ironwork come from?’, you’ve got to give a reason and the purpose of it.”
They also examined the current workflow and considered how to change it as they moved to file-based systems in order to deliver content faster and better to customers. Part of that is making sure staff have the right tool sets. Long was frustrated that he could go home and do more on his PC than he could do on his broadcast desktop.
“It shouldn’t be the case that somebody walks into this place having to log some material and that’s the only job they’re going to do that day. Why can’t they do that from home?” This saves fuel and time wasted commuting. Most of his sports staff work away from base and don’t want to have to return unnecessarily. “They should be able to do some of that job wherever they are. Whether it be on an outside broadcast or whether it be at home or wherever they are in our bureaus all around the world.”
However, this needs an infrastructure that can cope, and staff have to want to work that way, which means consulting them and giving them a say in the planning, providing laptops so they can work at home, and WiFi so they don’t have to worry about where to connect. Therefore, in planning the building, “every single person has been consulted at an individual level,” he says.
He also wants to move away from different staff having their own restricted set of desktop tools. “Why shouldn’t you have access to the MAM so that you can see content? You might not be able to take it off, but you can view it,” he says. “It’s those types of things that actually allow people to do their job more.”
Up on the roof
One sustainable design feature was taking all the big plant equipment typically found in a basement (such as uninterruptible power supplies) and putting it on the roof so that it can benefit from fresh air cooling. That also applies to the people as the building uses primarily natural ventilation. “People will have indications on their desktop and PC that it’s a good day to open the window and so if you want to open it because you want a bit more air coming in then please do. Here’s the button. Push it,” says Long.
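Sky doesn’t spell out the logic behind that desktop indicator, but a minimal sketch of the kind of rule a building management system might apply, with entirely hypothetical thresholds, looks something like this:

# Hypothetical sketch only: the real building management logic and thresholds
# used at Harlequin One are not described in this article.
def good_day_to_open_windows(outdoor_temp_c, wind_speed_ms, raining):
    """Return True if conditions favour natural ventilation (illustrative)."""
    comfortable = 16.0 <= outdoor_temp_c <= 24.0   # assumed comfort band
    calm_enough = wind_speed_ms < 10.0             # assumed wind limit
    return comfortable and calm_enough and not raining

# A mild, dry day with light wind would trigger the "push to open" prompt.
print(good_day_to_open_windows(19.5, 3.2, raining=False))  # True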
It uses LED lighting to reduce power consumption and heat, including for studio lighting, which is a big part of the equation. Sky initially went for tungsten studio lighting (because LED at the time didn’t offer a full colour spectrum), but quite recently found an LED product which, although it costs significantly more, has a much better sustainability profile, and is installing that instead.
The architects modelled the building’s airflow and ventilation, with chimneys to vent each studio, which Watters hopes will mean that studios will rarely need air conditioning. The building has a total of 13 external chimneys and two central cores. Windows are also controlled by the building automation, so that if conditions are right, staff can open them (they can close automatically if conditions change). It also uses recycled rainwater for the toilets.
“The ability for teams to collaborate easily played a major part in the development of the building. Currently the teams are split over multiple areas on the site, but now with touches like a spiral staircase in the middle of the offices, teams can communicate with others with only a short walk,” he says.
“The spiral staircase wasn’t actually part of the original brief, but then we thought about teams on this floor wanting to communicate with the teams above, a walk down to the lifts, up and then across; actually it’s much more about actually being able just to quickly go up, communicate with who you needed to speak to and come back down again.” The building has very minimalist office space but many collaboration zones in the wide cross corridors.
Going tapeless
“One of the biggest changes that’s occurring as part of this building, but also throughout the rest of the campus, is tapeless. We are moving from a tape-based culture to a tapeless culture. That obviously does have some sustainability argument behind it, but also from a workflow perspective. We currently ship around 4,500 tapes around site a day,” says Watters. This will disappear and content will be available on the desktop.
Indeed, Harlequin One is an entirely tapeless building, a policy it had to decide on very early. “It does also mean that we have had to make some changes to be able to communicate with the rest of the campus, but even our ingest facilities sit outside Harlequin One because we’re not allowing tapes in: once they come in you can’t get rid of them,” he says.
“We have tried to future-proof everywhere possible, and so far we haven’t had to make any fundamental changes. We put a lot of thought into how we did this before we started, because with a building it is expensive not to do that and then have to change what you’ve built. Mistakes have been made, but they are few and they’re relatively minor.”
Sky started moving people into the building a few months ago, starting with the Project Team itself, followed by the Tapeless Programme Team (as all of the broadcast teams that move in have to go tapeless first). There is also the matter of matching moves to production schedules: sports staff can’t move until their sport is off air, which can be difficult, as that is also when they would take their holidays.
Sky Sports News, for example, will go live during the summer, and other genres throughout the year until early next year. It will eventually house Sport, General Entertainment, Broadcast Ops and Broadcast Services.
Entertainment will mainly use Studios 4 and 5, which are big enough to hold audiences of 200 to 300 people. A demountable wall between the two studios allows them to be used as one; when it is in place, it provides 65dB of sound attenuation between them.
Architectural digest
In designing Harlequin One, Sky “went through a process of what we called ‘value engineering’, which was making sure we got the maximum value for our investment and we continued to do that with every change. But the fundamental principles have remained the same; technical details have changed, rooms have moved around simply as things have moved on, and in parallel to this what we also have to recognise is that Sky’s an organisation and in the past five years has fundamentally changed,” explains Watters. “We’ve got the addition of news channels, HD has become very prevalent and now 3D as well, so we’ve had to absorb them.”
“Everything is buried within the floor but equally accessible. We designed all the things like offices and infrastructure around runways so that we can actually run cables in a straight line,” says Long. This makes connectivity easier and ensures no degradation on circuits.
There are more than three million metres of cable (fibre and Cat6) in the building, with particularly deep underfloor voids (800mm high, the maximum permitted) to carry all the cabling and allow easy access for Sky’s engineers and those from systems integrator ATG Broadcast, which did the installation.
The building is five storeys high, including the basement, and about 100 metres long by 55 metres wide, but with its high ceilings and deep underfloor voids, Watters says: “If this was a normal office building, it would be nine storeys.”
Make, shape and share
The building houses eight studios, post-production, audio dubbing, control rooms for Sky’s interactive service, Champions League and Football First, voiceover booths, network media and transmission. “The way it’s stacked is: make, shape and share,” says Long. Make (the studios) is at the bottom, shape (post-production and broadcast operations) on the two central floors, and share (distribution and transmission) at the top, where there are control rooms dedicated to major services (such as Sky 1 and Sky Sports 1). Studios are fed from the side of the building, with the galleries in the centre.
The studios are different sizes, but the galleries are all the same, to allow a gallery to work with different studios, for redundancy and flexibility. Not every gallery can point to every studio; they are linked in pairs. This also allows Sky to spread its resources better, whether incoming circuits or graphics.
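Sky hasn’t published the actual allocation, but the pairing idea can be sketched as a simple lookup table; the gallery and studio numbers below are purely illustrative:

# Illustrative only: which galleries can drive which studios. The pairings
# shown here are assumptions, not Sky's actual allocation.
GALLERY_TO_STUDIOS = {
    "gallery 1": ["studio 1", "studio 2"],
    "gallery 2": ["studio 1", "studio 2"],
    "gallery 3": ["studio 3", "studio 4"],
    "gallery 4": ["studio 3", "studio 4"],
}

def galleries_for(studio):
    """List every gallery able to take over a given studio."""
    return [g for g, studios in GALLERY_TO_STUDIOS.items() if studio in studios]

# If gallery 1 fails mid-show, studio 1 can still be driven from gallery 2.
print(galleries_for("studio 1"))  # ['gallery 1', 'gallery 2']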
“There are eight galleries. We’re firing up five initially and having three spare. I think very quickly we’ll switch those on. But the infrastructure has been put in to those galleries and those studios ready for it. So we know the bill of materials. So we know exactly the equipment that is going to go into those galleries. We know how long it takes to equip them. The only thing we need is the business justification to switch them on.”
If they shoot 3D, they can use one gallery for 3D output and another for 2D output, or drive up a 3D truck and plug it in to one of two fibre connections on the outside wall.
“We wanted to be able to shoot from anywhere in this building. So we’ve got lots of areas of fibre connectivity,” with shooting areas on each floor. When a programme wants to shoot in its offices or one of the soft areas, it is easy to set up, thanks to the massive network infrastructure. This not only makes it simple to plug in a camera, but also makes it easier for staff to edit on the desktop, using the Ardendo MAM system, or on a laptop.
It means that instead of an assistant producer going into an edit suite with tapes, they can compile clips on their own, edit themselves, give it to a craft editor (there are 45 edit suites – a mixture of Avid and Quantel), or do some more work on it at home or on location in an OB truck.
Sky is also building a Multi-Protocol Label Switching (MPLS) network between its OB trucks and Harlequin One, to carry file transfers, baseband video, talkback, the telephone system, or anything else.
“They should feel that actually they are genuinely connected to the network, to the infrastructure, and if they want to pull content from the MAM here they should be able to do that without asking somebody. If they want to send content back they should just push that back into the MAM. If they want to press MCR it appears on their panel,” says Long.
Achieving that will mean working with outside broadcast companies to make tapeless working seamless, so that every feed that comes in is logged and the log is easy to amend. “We’re working with companies like EVS to build secondary logging tools” that link to EVS’ IPDirector, and these should be just as easy to use at home as at Sky.
Although the primary reason for Sky’s recent purchase of The Cloud network was obviously to offer its customers access to WiFi services away from home, it has the spin-off of enabling its staff to work remotely more easily. “We’ve always looked at technologies in isolation. What we’re trying to do more and more is look at them as complete joined up architecture. And that is very difficult. It’s not something to be done overnight,” says Long.
“We’ve got to stop taking little steps. We’ve got to take bigger steps; we’ve got to go from the PC to the iPad type of thing, which is really making a difference, and that’s the only way we can keep pace. It’s also making sure that what we put in here isn’t onerous in the fact that actually it gives us too much legacy or difficulty to change things. We have to think about how we can change things quickly and be a bit more agile.”
For example, its apparatus rooms are all stacked above each other for ease of connection via fibre, and it has a 3Gbps network to cope with 50p and 3D, and staff can work with 3D on the desktop. It currently works with DVCPRO HD (100Mbps), but will move to AVC-Intra, so all the infrastructure has to support that. However, many manufacturers hadn’t yet developed for AVC-Intra, so EVS and Quantel had to adapt their systems.
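To put those figures in context, a rough back-of-the-envelope calculation (generic video arithmetic, not Sky’s own measurements) shows why a 3Gbps fabric is needed for 1080-line 50p material and what a 100Mbps intra-frame codec implies on disk:

# Back-of-the-envelope figures only; generic video arithmetic, not Sky's data.
width, height, fps = 1920, 1080, 50            # 1080-line video at 50p
bits_per_pixel = 2 * 10                        # 4:2:2 sampling, 10 bits per sample
uncompressed_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p50 4:2:2 active video: {uncompressed_bps / 1e9:.2f} Gbps")
# Roughly 2.07 Gbps, which is why 3Gbps (3G-SDI class) links are needed rather than 1.5Gbps.

codec_mbps = 100                               # DVCPRO HD or AVC-Intra Class 100
hour_gb = codec_mbps * 3600 / 8 / 1000         # megabits per hour converted to gigabytes
print(f"One hour at {codec_mbps} Mbps: about {hour_gb:.0f} GB of storage")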
Technology choices
“What we try to do is choose the best technology, but there were a number of aspects which we made sure played into that – one of which was the sustainability features. So the whole building has been about sustainability, and one aspect of this is the sustainability of the equipment itself, for example using less power, but it also covers such things as packaging recyclability,” says Watters.
“All of those things were considered, as well as functionality and price.” Provided the required functionality was present then sustainability “was actually a fairly major weighting factor, and that was also true of everything else that was purchased for the building, such as office furniture.”
“We didn’t stick with any manufacturer just because we’ve always done it that way,” adds Long. “EVS might have been the only company really because of the way they work and because of the infrastructure.”
If something was too power-hungry or created too much heat, it was out. “Every single bit of equipment was measured for its environmental impact. From the power usage to the size to the heat output to where that product was built.”
When it started the process, Sky held seminars for manufacturers and told them it was looking for a 50% decrease in the power consumption of their products. “That was a challenge, of course, and some achieved it and some didn’t. But we were pushing them to start reducing what they were doing.”
It wasn’t just about an individual product, but the whole chain. “One product might not equal 50%,” but savings on a camera could be combined with those on the CCU, storage and so on.
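As a simple illustration of how savings combine across a chain (the wattages below are invented for the example, not Sky’s measurements):

# Hypothetical wattages to show how per-device savings add up across a chain;
# none of these figures come from Sky.
chain = {                       # device: (previous watts, new watts)
    "camera":  (300, 210),      # 30% saving on its own
    "ccu":     (400, 160),      # 60% saving
    "storage": (800, 400),      # 50% saving
}
old_total = sum(old for old, new in chain.values())
new_total = sum(new for old, new in chain.values())
saving = 1 - new_total / old_total
print(f"Whole-chain saving: {saving:.0%}")  # roughly 49%, even though the camera alone missed 50%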
Equipment that runs cooler also lasts longer. To help with this, broadcasters normally keep the whole equipment room completely cool, like a fridge, but Sky cools from under each bay. “We point-cooled” just the areas that were important. “We didn’t turn the room into a fridge, we turned the racks into a fridge. We also lowered the temperature of cooling because a lot of these rooms are just flat out air conditioning – running at the maximum cooling they can possibly do.” Sky told manufacturers their equipment had to run cooler. “We looked into, from a processor’s point of view and from an infrastructure point of view, how much cooling was genuinely needed for the equipment,” he explains.
Brand new matrix
Some technology had to be developed for the project. “The matrix that we’re using with Evertz is a brand new matrix; never been used before,” says Long. “Instead of having this great big video matrix in the traditional sense, we have multiple ones scattered around the building,” so that if one goes down they don’t lose everything.
The Ardendo system also had to be adapted and changed for the way Sky works. “There is so much of this infrastructure that is new in the way that we’re putting it together, and that gives it massive challenges,” particularly in having to work with manufacturers, so that they understand exactly what Sky needs.
Ardendo (part of Vizrt) is its main media asset management system, with editing tools on top of it, which will allow almost anyone to cut whatever comes into the MAM. “Everyone has that accessible on their desktop within a couple of seconds of that content coming in,” compared to having to record 14 to 20 tapes, distribute them, and have multiple people do separate edits.
“Now I can do an edit for two or three of the shows and that material is in one central place,” but they had to work with Ardendo to create that infrastructure. “Then we had to get the Ardendo network talking to the Avid system so that if I’m in an Avid suite I can pass my content from my Ardendo world into my Avid suite and back again.” The Quantel systems in Sky Sports News also link with Ardendo, as does the EVS system in the fast cut area. “We haven’t gone for that ‘one fits all’ scenario. They’re all linked. They all work with each other.” There is also some Final Cut Pro in use, but mainly on location.
“There is always going to be analogue coming into this building, but it automatically gets transcoded and put onto the Ardendo system.” If it isn’t HD, it is upconverted to HD.
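The article doesn’t describe Sky’s actual transcode engine, but as a generic sketch of what “automatically upconverted on ingest” can look like, here is a minimal Python wrapper around ffmpeg (assumed to be installed; the filenames are hypothetical):

import subprocess

# Generic illustration of SD-to-HD upconversion on ingest; Sky's real transcode
# chain into Ardendo is not described in the article.
def upconvert_to_hd(src_path, dst_path):
    """Scale standard-definition material to 1080-line HD using ffmpeg."""
    subprocess.run([
        "ffmpeg", "-i", src_path,
        "-vf", "scale=1920:1080:flags=lanczos",  # upscale with a high-quality filter
        "-c:a", "copy",                          # pass the audio through untouched
        dst_path,
    ], check=True)

# upconvert_to_hd("incoming_feed_sd.mov", "incoming_feed_hd.mov")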
Sky chose Grass Valley for its cameras, buying 20 Grass Valley LDK 8000 Elite WorldCam multi-format HD models, having compared them with Sony’s HDC-1500. “Environmentally they were much more efficient. Also from a sensitivity point of view and also from a quality point of view, in comparison shoot-outs, we were very happy with the LDK 8000.”
Sky also wanted to go with a new, very large vision mixer. Rather than buying 3ME or 4ME systems, it bought 5ME 96-input Grass Valley Kayenne mixers for every gallery, so they all operate in the same way and are fully flexible.
Sounds good
Sky has a massively interconnected audio system, building on its long relationship with Calrec, which provided the 88-fader Apollo consoles. But the broadcaster has put them together in a completely different way, around its own central digital signal processing (DSP) arrangement: multiple linked DSP systems with thousands of inputs. It is using Calrec’s Hydra2 network with two stand-alone Hydra2 routers (each with 32 fibre-optic connections and 512 simultaneous channels per fibre) in addition to the routing systems in each console, giving about 15,000 input and output points.
“Every one of those mixers is linked. So if I was in gallery four, and my desk went down, I could just walk into the next gallery and I’m back on air again with gallery five. If I had a band in studio five I can go up to my fast edit suites, which have what we call the ICA [the International Commentary Area used for Champions League and Football First] and mix the band up there because it’s linked to the main DSP.” It is totally flexible. “If I wanted a bigger sound desk, I could use four and five linked together. In fact I can link all of them together. I can have thousands of inputs and thousands of layers of audio throughout the whole building.”
Sky has also moved away from carrying Dolby E around Harlequin One, using discrete audio throughout the building. “We’re still doing Dolby 5.1 outside, so we’re still transmitting it, but we didn’t want to move Dolby E around this building.” Incoming Dolby E signals are decoded into their component parts, with the six channels available everywhere, so no one has to worry about running out of Dolby decoders; the surround mix is just another six sources. On the way out it gets re-encoded.
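In practice that means the surround mix simply becomes six ordinary router sources. A tiny sketch of the idea, using the standard SMPTE 5.1 channel order (the source naming is illustrative, not Sky’s wiring):

# Illustrative only: treat a decoded 5.1 mix as six discrete router sources.
# Standard SMPTE channel order for 5.1; the source naming is made up.
DISCRETE_51 = ("L", "R", "C", "LFE", "Ls", "Rs")

def surround_as_sources(mix_name):
    """Expand a surround mix into six individually routable channels."""
    return [f"{mix_name}/{channel}" for channel in DISCRETE_51]

print(surround_as_sources("studio5_band_mix"))
# ['studio5_band_mix/L', 'studio5_band_mix/R', 'studio5_band_mix/C', ...]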
Sky has more than 200 Wohler audio/video processing monitors, notably 131 AMP2-16V-3G 16-channel models, to provide integrated audio monitoring and processing, including loudness metering and control, audio mixing and routing, SDI audio embedding, and instantaneous Dolby bit-stream analysis (Dolby Zoom), within a single system. It is also installing 34 AMP1-16-3G systems, 45 AMP1A-Plus two-channel audio monitors, and three of Wohler’s latest AMP1-S8-3G multi-format audio monitors.
The most important aspect of the new build for Long has been how everything interconnects, whether the Evertz and Axon infrastructure, or the way that its Ardendo system works with Avid, EVS and Quantel.
“Fundamentally it should be a system that is linked by a common goal, which is to move content from here to there in the quickest possible time with the least amount of transcoding going on between the two of them. And that’s been our mantra all the way through it.”
www.atgbroadcast.co.uk
www.avid.com
www.axon.tv
www.calrec.com
www.chyron.com
www.evertz.com
www.evs.tv
www.grassvalley.com
www.quantel.com
www.vizrt.com
www.wohler.com