Monday, 2 January 2012
28C3: Review (English version)
By popular demand, here is a translation of the 28C3 review article.
It is done. After two months of preparation, 13 people on site at the 28C3, supported by others from outside, handled the streaming & recording of the congress. The 97 events were streamed in 7 different formats, with 27 concurrent streams in total, distributed by 39 relay servers. HTTP Live Streaming alone generated more than 10 TB of traffic. The raw video data takes up about 1.8 TB on our hard disks.
Day 1
The beginning of the first day was characterized by problems with the network outside the BCC, which also generated the most traffic on our support tickets during the whole 28C3. The network problems were caused by dropped peerings: the network had several uplinks, and some of them happened to get disconnected due to incomplete routing tables. The NOC fixed it, at least for us, by assigning our subnet to a stable uplink. We never encountered that problem again during the congress, and the streams left the BCC without any further trouble.
We also had to explain why HTTP Live Streaming, a new service this year, did not work on some devices and media players: apart from Apple products and the VLC 1.2 pre-release, hardly any player or browser supports the format yet, presumably because it is relatively new. On the other hand, its architecture made it really simple to generate additional playlists and reuse the live streaming snippets for on-demand streaming.
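As a side note on why that reuse is simple: HLS chops the live stream into short MPEG-TS snippets that are referenced from an .m3u8 playlist, so turning a finished live stream into an on-demand one essentially means writing a new playlist over the snippets that already exist on disk. Here is a minimal sketch in Python; the file layout and the 10-second segment length are assumptions for the example, not our actual setup:

```python
#!/usr/bin/env python3
"""Sketch: build an HLS video-on-demand playlist from the MPEG-TS
snippets that were already written out during a live stream."""

from pathlib import Path

SEGMENT_DIR = Path("segments")   # hypothetical directory of saved .ts snippets
TARGET_DURATION = 10             # assumed segment length in seconds

def write_vod_playlist(out_path: Path) -> None:
    segments = sorted(SEGMENT_DIR.glob("*.ts"))
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{TARGET_DURATION}",
        "#EXT-X-PLAYLIST-TYPE:VOD",   # marks the playlist as complete
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for seg in segments:
        lines.append(f"#EXTINF:{TARGET_DURATION}.0,")
        lines.append(seg.name)
    lines.append("#EXT-X-ENDLIST")    # tells players the stream is finished
    out_path.write_text("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_vod_playlist(Path("ondemand.m3u8"))
```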
Day 2
On day 2 we started the release process and were productive right from the beginning, so we were able to release some recordings as torrent files just two hours after the end of an event, a personal record. This worked well until the end of the congress; in the end we had released all audio files before we left Berlin.
Some script kiddies also had their fun and took our streams overview site down with a lot of concurrent connections. The timing was unfortunate, as both webmasters were outside the BCC grabbing something to eat, and we only reached their voice mail. When one of them finally got back to our VOC, he found over 17 GB of error logs, which he instantly archived to /dev/null. After some web server configuration tweaks the page was back online and performing well; it did not cause any more problems for the rest of the congress.
A nice piece of our infrastructure is the slides-only stream, which displays slide snapshots at a maximum rate of 2 per second and is also available while the other streams show the pause screen. That is how, between two lectures, a blue screen came to be displayed for several minutes: we really do stream everything.
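The underlying idea fits in a few lines. The following Python sketch is only an illustration (the capture function is a dummy stand-in, not our actual grabber): publish a new snapshot whenever the slide changes, but never more than twice per second.

```python
import random
import time

MAX_RATE = 2.0                  # cap from the article: at most 2 snapshots/s
MIN_INTERVAL = 1.0 / MAX_RATE

def capture_slide_frame() -> bytes:
    # Stand-in for grabbing the real slide signal: a dummy frame that
    # changes now and then, so the loop below has something to publish.
    return bytes([random.randrange(4)])

def publish_snapshot(frame: bytes) -> None:
    print(f"published snapshot {frame!r} at {time.strftime('%H:%M:%S')}")

last_publish = 0.0
previous = None
while True:
    frame = capture_slide_frame()
    now = time.monotonic()
    # Publish only when the slide changed and the 2/s cap allows it.
    if frame != previous and now - last_publish >= MIN_INTERVAL:
        publish_snapshot(frame)
        previous, last_publish = frame, now
    time.sleep(0.05)            # poll the source ~20 times per second
```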
Day 3
During a C3 it is usually quite warm in the VOC, because we are right next to the server room and, being on the top floor of the BCC, all the warm air rises towards us. On days 1 and 2 it was still comfortable, but on day 3 the temperature rose, and one of our H.264 encoders did not like that. As a first countermeasure we removed the cases of the Shuttle barebones and moved our fan from the VOC to the server room to increase the airflow. That cooled the encoders down by 12 °C, but did not solve the actual problem. After some investigation we suspected the HDD of doing nasty things; replacing it solved the problem for the remaining congress.
Furthermore on day 3: the traditional Fefe peak! The "Fnord-Jahresrückblick", traditionally one of the most anticipated lectures, brought us a peak of at least 3,500 clients concurrently watching the streams from Saal 1. Together with the other lectures, at least 4,100 clients were watching the streams at that time (another personal record!).
At the end of the day we did some hands-on work during the concert. We wanted to contribute our experience with cameras and live video mixing to get some motion into the video, which can be essential for a concert.
Our Setup
There were some modifications to the setup during the congress. We extended our streaming with additional formats, some of them planned in advance and some spontaneous. At the beginning of the congress we were streaming in WMV, H.264 via RTMP, HTTP Live Streaming (H.264 in MPEG-TS) and slides-only. In addition, the streaming team of CCCamp2011 set up some infrastructure for Ogg/Theora that was fed with our signals. Later an audio-only Ogg/Vorbis stream was added and embedded in the web pages of the slides-only stream. One of the spontaneously evolving formats was the ASCII stream, playable via a telnet client, which was at the top of the wish list in the 28C3 event wiki.
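For the curious: a telnet-playable ASCII stream is conceptually just a TCP server that keeps pushing ASCII-art frames with terminal escape codes to every connected client. The following Python sketch illustrates the principle with a dummy frame generator; the port and the rendering are assumptions for illustration, not our actual implementation (which converted the real video signal to ASCII art).

```python
import socket
import threading
import time

HOST, PORT = "0.0.0.0", 2323        # port is an assumption for this sketch

def ascii_frames():
    # Stand-in for an aalib/libcaca-style video-to-ASCII converter:
    # a moving marker, just so the sketch is runnable on its own.
    width, height = 60, 10
    pos = 0
    while True:
        rows = ["." * width for _ in range(height)]
        row = list(rows[pos % height])
        row[pos % width] = "#"
        rows[pos % height] = "".join(row)
        pos += 1
        # Clear the terminal, move the cursor home, then draw the frame.
        yield "\x1b[2J\x1b[H" + "\r\n".join(rows)
        time.sleep(0.1)

def serve_client(conn: socket.socket) -> None:
    try:
        for frame in ascii_frames():
            conn.sendall(frame.encode("ascii"))
    except OSError:
        conn.close()                 # client disconnected

with socket.create_server((HOST, PORT)) as srv:
    while True:
        conn, _addr = srv.accept()
        threading.Thread(target=serve_client, args=(conn,), daemon=True).start()
```

Pointing a telnet client at the server (`telnet localhost 2323`) then plays the "stream" directly in the terminal.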
Monitoring all the signals and servers required quite some resources. The visual monitoring of the video signals from Saal 1 to 3 was done with 6 SDI CRT monitors placed in the VOC. For relaying the RTMP streams we used a small tool called crtmpserver, which had the bad habit of stopping distribution when the connection to the stream source was lost; this could only be fixed by restarting the daemon. To catch such situations, a dedicated laptop showed a web page with the RTMP streams from all relay servers simultaneously. The 36 Flash players in total consumed so much CPU power that the streams degraded to slide shows, but that was just enough for monitoring. For RTMP distribution the original plan was a mix of erlyvideo and crtmpserver, but during the 28C3 we migrated all relay servers to crtmpserver, since it showed much better performance than erlyvideo had in 2010. Only the relay for the "No Nerd Left Behind" stations was left running erlyvideo, since the number of clients there was low enough to benefit from the uninterrupted delivery that erlyvideo can provide.
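Until the migration, dealing with a stuck relay mainly meant noticing it quickly and restarting the daemon. A watchdog along the following lines could automate that; this is a hypothetical sketch (the health URL, the service name and the intervals are made up for illustration, and crtmpserver does not necessarily expose such an endpoint):

```python
import subprocess
import time
import urllib.request

STATUS_URL = "http://localhost:8080/status"            # hypothetical health check
RESTART_CMD = ["systemctl", "restart", "crtmpserver"]  # assumed service name
CHECK_INTERVAL = 30                                    # seconds between probes

def relay_is_healthy() -> bool:
    """Return True if the relay still answers; False on timeout or error."""
    try:
        with urllib.request.urlopen(STATUS_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

while True:
    if not relay_is_healthy():
        # The relay stopped distributing (e.g. after losing its source);
        # restarting the daemon was the only fix we found.
        subprocess.run(RESTART_CMD, check=False)
        time.sleep(60)             # give the daemon time to come back up
    time.sleep(CHECK_INTERVAL)
```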
In the meantime, our infrastructure has been partially set up again in Ilmenau to finish encoding the remaining files. These should all be available by now via FTP, HTTP and torrent; see the Documentation page in the event wiki.
We love Cats
In the night between day 3 and day 4 we streamed the Nyan Cat to run some important latency tests (the Nyan Cat page shows a seconds counter) and reached a score of 33,904.9 seconds. The stream was also distributed via DVB-T to the TV sets in the BCC and played through the night (partially with audio), which cheered some people up. It was noted in the subsequent Q&A session that FeM supports "No Cat Left Behind". By the way: no animals were harmed during the test. (But maybe some humans.)
Feedback and support requests
Our main purpose was the streaming and recording of the events, but we also got some slightly unrelated requests. For example, we got calls from people who needed some special cables. And we got endless questions about the track played between the events; for everyone who still doesn't know: it's "Machine Lullaby" by "Fear of Ghosts". The question why $player does not support HTTP Live Streaming has already been answered above. Most reports about non-working streams were not as easy to diagnose as the reporters may have thought (see the stats at the beginning of this article to get an idea). Sometimes it was just a bad connection or route on the client side, or the bad idea of watching streams and leeching files at the same time.
But we did not only receive support requests. Some comments from viewers made us smile, for example: "Internet works. WLAN works. Streaming works. How are we supposed to develop a congress feeling under these conditions? @c3streaming @c3infodesk No, you're great!"
Many Thanks
We would like to thank a few organizations which made it possible to build and run this year's streaming and recording infrastructure. In particular: ErgoDATA GmbH, Selfnet e.V., ATekoN e.V., Hetzner AG, 1HE-Server.com, the TU Ilmenau (notably the computing department and their chief network technician, the Institut für Medientechnik, and the room equipment team) and others not named here.
Also thanks to the CCC for running this well-organized event, the FOOD team for not letting us die of starvation, the NOC team for their tremendous work, the guys from the BCC technical services for helping us out at any time, and Nick Farr for entertainment and Club-Mate.
We also want to thank all those who pointed out problems while we were busy streaming and fixing other issues. We always tried to monitor everything, but we did not always have the time to watch, so it would not have been possible without external help.
And we thank everyone we have forgotten to mention, all the speakers, and all the viewers out there on the internet. See you at the 29C3, wherever it may take place.
If you liked our work, please consider thanking us with a small donation.