2/21/17

Facebook Live using FFmpeg

You may have seen that some Facebook Live posts look better than others, more like they were shot in a studio and not on a cell phone. Well, that's because they were. There are various hardware devices and web streaming services that can do it, but I'm going to show you an easy way to do it for free (assuming you have the needed computers and cameras, etc.).

Step One. Set up a computer with FFmpeg. FFmpeg is free, open-source software described as "A complete, cross-platform solution to record, convert and stream audio and video," and it is very powerful. With it I have converted video files, recorded video files, ripped audio from a video, and a few other things. Today we're going to take a live stream from an Axis IP camera and then stream it out to Facebook Live using just one command on the command line. I'm using the Windows version of FFmpeg, which you can get here https://ffmpeg.zeranoe.com/builds/ or you can get Linux, Mac, and any other build from http://ffmpeg.org
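A quick way to confirm the install worked is to open a command prompt and ask FFmpeg for its version; this should print the build number and configuration it was compiled with:
  ffmpeg -version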

Step Two. After you've gotten FFmpeg installed, it's time to set up your video source. Since we're talking about Facebook Live, we're going to choose a live source. For that I'm using a standard-definition Axis Q7401 encoder connected to the church's AV system through a series of converters to our very expensive HD cameras. This process would work the same if you were just using an IP camera. So, for me, the RTSP string from the camera looks something like this rtsp://10.x.x.x/Axis/media.aspx I would recommend testing this out in VLC Player to make sure you get the syntax correct before moving on. There is an awesome site that will tell you just about every connection string for every device here.
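If you want a second opinion on the RTSP string, the FFmpeg download also ships with ffplay, which can preview the stream right from the command line. A quick sketch using my example address (swap in your camera's actual string):
  ffplay -rtsp_transport tcp "rtsp://10.x.x.x/Axis/media.aspx"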

Step Three. Once you've gotten the correct syntax for your camera, you need to feed it into FFmpeg and tell it to do something. So, let's take that input and stream it out to Facebook Live. To do that, we first need to schedule an event and get the one-time connection key. Unfortunately, you cannot use this method with a personal Facebook profile; you have to have a "Page" to be able to do this. Once you are an admin of a Page, you will get more options at the top like this:

Click on "Publishing Tools" and then click on "Videos" in the left menu. You will see a screen with your past videos, if you have any, and on the top right you will see "Live". Next, you will see a page like this showing your connection string:

This page tells us exactly what we need. We'll use the top "Server or Stream URL". Don't worry if you didn't copy it before clicking Next, because it will display again. Take note of the note at the bottom: once you click Next, you have 5 hours to go live before the key is no longer valid. Click Next. A new screen comes up that asks you to name your event and then allows you to preview your stream before you go live.

On this screen you can see your stream key again and name the video.
If we had an active stream it would show up here. It won't let you click the "Go Live" button until you have a stream going. But you can schedule a future stream by clicking the dropdown arrow and picking "Schedule Live". That part is cool because it puts a blurb on your feed saying that you're going live at a specific time, which could help drive traffic to your broadcast. For now we're going to just do the Live option, so let's get your feed going.

Step Four. Now is when the rubber meets the road. Open up a command window (on Windows 8.1 or 10, press Win + X and then select Command Prompt). At the prompt we're going to use the following format:
  ffmpeg -i [your RTSP input] -f flv [your Facebook stream URL]

Facebook Live is also picky about the audio and video settings. You can view the requirements here. To get this to work in my situation, I found that I had to do some work on my stream to get a reliable output.
ffmpeg -rtsp_transport tcp -y -i "rtsp://10.0.0.99/axis-media/media.amp?streamprofile=fblive" -t 5400 -c:a copy -ac 1 -ar 44100 -b:a 128k -c:v libx264 -pix_fmt yuv420p -r 30 -g 60 -vb 2048k -minrate 2000k -maxrate 4000k -bufsize 4096k -threads 2 -f flv "rtmp://rtmp-api.facebook.com:80/rtmp/133333333333?ds=1&s_l=1&a=ATg99929999x84tR"

As you can see, there are a lot of options you can customize with FFmpeg! In a nutshell: I created a different "Stream Profile" on my Axis device with the source settings I wanted, like whether it should display the date at the top, the resolution, the frame rate and bitrate, and some other settings. Then I use the -t option to run the command for 5400 seconds, or 90 minutes. I copy the audio stream and set the sample and bit rates. For the video, I had to run it through the H.264 encoder and set the color space as well as the frame rate and the min and max bitrates. I also found that I needed to dedicate two processors to the transcoding with the -threads 2 option to keep up on my computer. Then the -f flv option wraps it all in an FLV container and sends the RTMP stream to the Facebook API with my streaming key.
If I press the Enter key and run the command, it will throw up a bunch of stuff in the terminal and then settle down to something like this:

You may see some errors when the feed is just starting out, but they should go away, and it should look like the bottom half of this command prompt window. If you go back to your web browser and the Facebook Live scheduling page, you should see the window has changed to "Preview" and is showing your video! You will notice that the "Go Live" button is now clickable. You can click it to start right away or click the down arrow and schedule it for later. Just make sure your feed is scheduled to run long enough, or leave the -t option out entirely; you will just have to press Ctrl + C in the command prompt to terminate the feed when you want.

Some additional notes:
  • If you lose your stream for any reason, the Facebook Live event will stop and post to your timeline.
  • This is a one-time key, so you will need to do this process each time. There is probably a better way of doing this with an API, but I haven't gotten to that yet.
  • If you don't want the live post to be viewable after the event ends, you can select "Unpublish after live video ends". This will preserve your stats and allow you to post it later if you want.
  • If you have any prerecorded music or video in your stream, Facebook will most likely not catch it in the live feed, but will take down your replay soon after it is posted. We've had this happen many times. You usually have the option to post it if you agree that you have the proper copyright information, but it's just a real pain. The better option would be to send a different feed to the live stream that doesn't have audio from Spotify or other music. I imagine they send the video through something like the Shazam app to identify any copyrighted music.
  • You will get better quality with a better encoder, obviously. One weird limit is that Facebook Live only goes up to 720p right now, so you'll have to downconvert if you've got a 1080p stream (see the one-line tweak after this list).
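For that downconversion, one hedged option is to add a scale filter to the command from Step Four; the -2 lets FFmpeg pick a width that preserves the aspect ratio while staying divisible by 2, which the H.264 encoder requires:
  ffmpeg -i [your RTSP input] -c:v libx264 -vf scale=-2:720 [your other options] -f flv [your stream URL]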
So, this is the low-cost, DIY Facebook Live solution. There is software out there that will do it for you, like Wirecast, but most cost much more and don't offer many more features, mainly because Facebook has it so locked down.

5/21/15

Remove Duplicate Exchange Objects on Office 365

If you are doing a migration to Office 365 or an upgrade between Exchange versions, you may end up with duplicates in some or all of your Exchange accounts. This happened to us after moving from Exchange 2013 to Office 365 when some things went wrong (future blog post coming, I promise), but I wanted to get this down before I forgot it.

So, I've got about 100 users that are now on Office 365 (Exchange Online) and have duplicates, and I don't want to manually go to every computer and try to fix it. Also, I don't want my staff to have to jump through a bunch of hoops to try and fix it (which would probably cause more work for me later). If only there were a way I could script all this...

Turns out, someone has! Michel de Rooij has created an AWESOME PowerShell script called Remove-DuplicateItems.ps1 and you can read more about it here. There are a lot of options and switches that you can use with this script and some are documented better than others.

In my initial testing I found that this wasn't removing duplicates even when used in "Full" mode instead of "Quick" mode. It turns out that the third-party tool we used to sync our accounts in our Office 365 migration changed the IDs of the items, and even items that looked like perfect matches reported different sizes (the two attributes that are used when comparing). So, after some head scratching, I found that I needed to comment out a few lines of code to get it to work in our situation.

On line 357, I commented out
#if ($Item.Size) { $key+= ","+$Item.Size.ToString()} 
by putting a # in front of the line.

And then in the "IPM.Contact" section starting at line 359, I commented out a few lines as seen below:
This was done because of the way our duplicates were made: the Size comparison was failing because the two records weren't identical. I also found that, for whatever reason, CompanyName and the phone number fields caused some issues for me. Make sure you test this script on a small set of users, because there can be some false positives when you comment these out. The good news is that you can use the -DeleteMode MoveToDeletedItems option to have the items go to the Deleted Items folder, where they can be easily restored. So, awesome stuff, but how does it work?

  • First you need a computer with PowerShell installed (it doesn't have to be the Exchange server) as well as the Exchange Web Services (EWS) Managed API 1.2 (or later).
  • You need a .csv list of the users you want to remove duplicates from; a sample is shown after this list. (I recommend several batches of fewer than 20, because it takes a while, only does one mailbox at a time, and will stop the script if a box fails.)
  • If you're running this against Office 365, you need to have set up a user with Impersonation rights on the mailboxes (outlined in the source blog).
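The Users.csv itself can be very simple. Here's a minimal sketch; I'm assuming an Identity column header, so check the script's documentation for the exact property name it binds from the pipeline:
  Identity
  jdoe@yourdomain.com
  asmith@yourdomain.com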

Now, run the script. (My comments are the lines beginning with #.)

# Set credentials for the next commands - if using Office 365, make sure you
# use the username@domain.com format

$Credentials = Get-Credential

# The next command references a script called Remove-DuplicateItems.ps1, which can be
# downloaded at http://bit.ly/1IRgTnc

Import-CSV Users.csv | .\Remove-DuplicateItems.ps1 -Type All -Credentials ($Credentials) -Impersonation -DeleteMode MoveToDeletedItems -Mode Full

When it's running, you will see something like this:


A helpful option when you are testing this out is -Verbose, which will outline each task the script is doing and ask you what you want to do with each delete. When you are doing a batch it's not very helpful. You can also use the -Debug option to get an in-depth look at how the script is comparing things (helpful in determining why some duplicates are getting deleted and some aren't).
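For example, here is a hedged single-mailbox test run with full output. I'm assuming the script accepts an -Identity parameter here; check Get-Help .\Remove-DuplicateItems.ps1 for the exact parameter names in the version you download:

.\Remove-DuplicateItems.ps1 -Identity testuser@yourdomain.com -Type All -Credentials ($Credentials) -Impersonation -DeleteMode MoveToDeletedItems -Mode Full -Verbose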

I spent many hours trying different scripts and figuring out why it wasn't working, but I did get it to work. I hope this blog post helps someone else out there, and I'm sure I'll be looking back at it the next time I have to run this :-)

7/25/13

The Great Migration to Exchange 2013

We upgraded to Exchange 2013 this week from Exchange 2007.  We have wanted to upgrade almost since the day we installed Exchange 2007. Not because it was bad, but because we installed it only a few months before Exchange 2010 was RTM (long story for another blog post). We are also hoping that the greatly enhanced Outlook Web App will help with our volunteers and BYOD users.

2003 to 2007 was a big project, and I did it all myself last time. There were many unforeseen issues that took a lot of time to resolve, and it was not something I wanted to repeat without help. This time I enlisted the services of a friend and professional who is experienced with this kind of thing: Ed Buford, someone I have known for a while, who works for Pinnacle of Indiana.

I did what I could to cut down on the consulting costs: upgraded our servers to VMware 5, downloaded and installed Server 2012 Datacenter, updated said server, and downloaded Exchange 2013 CU2. Ed helped with the more technical stuff like prepping Active Directory, checking the configuration on Exchange 2007, and installing and configuring Exchange 2013 using best practices (better than mine; since Exchange 2013 was still really new, some things were kind of buggy).

Then came the migration. We had planned to do this on a Sunday afternoon because, for us, that is the lightest day of the week and would have the least impact on our staff. Ed and I figured it would take 4-6 hours to transfer our 260 mailboxes. We were wrong. After transferring just my mailbox as a test (which was about 4 GB), we saw a problem. It was only transferring at about 11 Mbps and took about an hour. The Exchange 2007 server was only pushing a few percent on CPU and network, but the Exchange 2013 server was almost maxing out its two vCPUs. I decided to move it to a different host and try again, this time with 4 vCPUs. For this batch we ran about 10 accounts and found that it would only transfer about 4 to 6 at a time. All 4 vCPUs were maxed out. After that batch I shut the VM down and added two more vCPUs (this host only had 8 cores). The next batch of 50 mailboxes went a little faster and would transfer 6 to 8 boxes at a time. We figured this was as good as it would get, queued up a few more batches, and called it a night.

In the morning we saw that the boxes that had moved were accessible by OWA and ActiveSync. Boxes that hadn’t moved yet were only accessible by Outlook, since we had changed our autoconfigure settings and certificates. I queued up the last batch but saw that we had a problem: they weren’t moving. After some exploring we found that restarting the Exchange Transport service got things going again (it claimed it was up). We figured it was a fluke and continued with the migration. By the end of the day, all mailboxes were transferred, ActiveSync devices were syncing, Outlook was connecting, and Outlook Web App was working great.

The next day I discovered that at some point in the night/morning, we had stopped getting outside email. Our hosted Barracuda was saying that it had delivered a lot of messages, but where did they go? We still had email routing through our old server, so I checked there and saw that we had 1590 messages waiting to be delivered!! We restarted the Exchange Transport service on Exchange 2013 and watched as all of the messages were delivered in about 15 seconds. After this we moved inbound email over to the new server and thought we were done. Nope. This would continue to happen again and again over the next few days, but now messages would spool on the Barracuda (thankfully we use it for this reason!) and then deliver once we restarted the Transport service. Ed found that others on the web were experiencing this same issue, outlined here.
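If you hit the same symptom, the workaround boils down to two commands. A minimal sketch, run from the Exchange Management Shell on the 2013 server (Get-Queue is an Exchange cmdlet, so plain PowerShell won't have it):

# Restart the Transport service (this is what got mail flowing again for us)
Restart-Service MSExchangeTransport

# Then watch the backlog drain
Get-Queue | Sort-Object MessageCount -Descending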

After deleting the old receive connectors as people in the above TechNet thread suggested, and only keeping the default ones, it looks like the issue is gone. The problem is I NEED those connectors. So, I’m going to add them back one by one and see what happens.

So, in summary, these are the things I learned about migrating to Exchange 2013

  1. Communicate well to your users. I’m not sure if you can over-communicate, but you want to get as close as you can. No matter what you do there will still be those saying “Oh, was that today?”. Since email would be down for some users during and after the transition, we set up a blog they could go to for updates and directions.
  2. Plan a lot of time. More than you think.
  3. Make sure your new Exchange server has plenty of processors (at least for the transition; you can drop a few after that). More processors = faster migration.
  4. Be prepared for something to go wrong. In our case, we already had outside help queued up. If you’re doing this solo you should definitely do some more tests before the big “Moving Day”.
  5. Plan for issues with certificates and namespaces if you’re changing those in any way with your migration (you probably will). Android devices seemed to have the most trouble with this since they all handle things a little differently depending on their OS version and device model. iOS devices were pretty predictable: once we knew what change we had to make, they were all the same.

I’ll try to update this blog in the coming weeks as we get used to Exchange 2013 and also note any issues and how we resolved them.

Issues

  • Users accessing our OWA site via HTTP are not being redirected to HTTPS. We have tried just about everything we can and it still won’t work. If you have figured this out, let me know! Please!

4/12/13

Simulcast 2.0

Back in 2010 I told you about how The Chapel had been using the latest HD video technology to be “one church in many locations”. We are still a multi-site church and have grown from our initial 4 video campuses to 8 campuses by 2012 (5 with live video & 3 with “tape”). Our current video simulcast solution was working great for the 5 campuses on our fiber network, but we had not moved the other 3 to live video for a number of reasons.

Some of the problems with our current solution:

  • Expensive – Joining another campus to our fiber network to enable live video would require thousands of dollars of network equipment and thousands more to get the fiber into the building and terminated.
  • Flexibility – Signing a 3-year contract on fiber is about as fun as taking out a mortgage on a house (and hurts the pocketbook about the same!). The Internet and network world is constantly changing, so why get locked into something?
  • Scalability – To date we had been lucky that all our locations were within the same Chicago metro area and could be serviced by AT&T’s Opt-e-Man product. But what happens if we want to cross the border into Wisconsin or downstate Illinois? Or another state or country? Our telco broker warned us that that could be an expensive problem in the future, as it would most likely require an even more costly circuit.

I knew that I didn’t want to get caught off-guard when it came time for renewal in 2013 and be forced to re-sign. So I started working with our telco broker early in 2012, and I am glad I did! We heard pitches from several of the top vendors, and to my surprise most were even more expensive than AT&T!

Throughout this time I had been talking with Chris Kehayias at Calvary Chapel Melbourne, who introduced me to Zixi at the Church IT Round Table event they hosted in 2011. Zixi can best be described as a transport service specializing in video: it ensures that my video gets from point A to point B without dropping a packet. Chris and others had been using it to deliver video in a point-to-multipoint church environment with amazing results. After getting a demo set up in our environment we were sold! The great thing about how Zixi works is that we really didn’t have to change our workflow, encoders/decoders, or even bitrate. Zixi just “dropped in” seamlessly.

We have now moved most of our receive campuses to Zixi but still maintain a fiber connection between our two broadcast campuses, Grayslake and Libertyville. Our receive campuses each have their own 27/7 Mbps Comcast coax internet connection. So far this has worked great for us and is able to keep up with our two HD video feeds running at around 16 Mbps. Our send site uses a 40/40 Mbps Comcast fiber Internet connection for the upload.

We had to make some network changes since we were going from a fiber point-to-point system to a VPN system. For that we are using SonicWall TZ-200s & TZ-205s at our receive sites and an NSA-240 at our send site. The NSA-240 seems to handle the video just fine but is struggling to keep up with other traffic, so we are in the process of upgrading it. The TZ-200s & 205s are doing just fine at the receive sites, though.

Even with these changes, we should see a 43% yearly reduction in our simulcast and network costs! We are now able to get all our campuses live video for less than we were paying for just 5 in the previous model. We are also able to set up a video campus anywhere we have a decent internet connection.

Earlier this year at the Spring National Church IT Roundtable event, I gave a “Ten Talk” about Zixi and most of what I detailed above. You can check out the video here on YouTube.

Also, this is a link to the presentation slides.

If you have any comments or questions, leave them below or catch me on Twitter

3/13/12

DIY DVR

 

Tired of high-cost cable or satellite but don’t want to give up your DVR? As we were tightening our financial belt, the Dish subscription went, and so did our DVR. We get a lot of HD channels over the air (OTA) in our area, but most of the shows that we want to watch are starting just as we’re putting the kids to bed. My solution was to build my own DVR.

My Setup

  • Tuner = HD HomeRun – Can be had now for around $100 on Amazon
  • OS = Windows 7 Pro x64 – Running Windows Media Center
  • Computer = HP Pavilion a1430n – Started out with an old Dell P4; this is much faster
  • Video Card = GeForce 8800 GTS 320 MB, given to me by my brother – Using a DVI-to-HDMI converter
  • Remote Control = iPhone. I found an app that lets me control Windows Media Center from anywhere in my house called Remote Kitten. Lame name, but it works.

I went with the HD HomeRun dual tuner for 3 reasons:

  1. More future-proof than other solutions. Plugs into your network and can be accessed by any computer on the network.
  2. Can be put anywhere in the house.
  3. Decent Price. More than other dual tuners but it makes up for it in flexibility.

The Verdict

I love it!

Quality - The quality is amazing.  Looks just like watching it live. The processor seems to keep up but I think the video card makes the difference. I think it would be even smoother on the WMC menus if it had more RAM (2 GB) and if it wasn’t just DDR.

Ease of use – Windows Media Center is great. The only thing that would be better would be if it tied into iTunes. Can control things from my iPhone or iPad.

Detractors – The only real downside is the old computer. It would be sweet to have a small HTPC with a modern processor and native HDMI support. You also need a larger hard drive if you want to keep anything like movies or whole series around. The 200 GB hard drive fills up fast! (200 GB = about 20 hours of HD)

My experience with this home-built DVR has me wondering if something similar could be put together on a budget for church-related streaming. Anyone know of one?

1/24/12

Multiple subnets with one VMware ESXi host

As we’ve moved more and more of our critical infrastructure at The Chapel to the virtual world, I’ve struggled on occasion with the issue of setting up network cards in VMs to work on different subnets.

This became a real issue when we migrated from our Cisco phone system to our virtualized MiTel phone system. All was good until I needed to set up the “MiTel Border Gateway”, which acts as a firewall and SIP gateway for the phone system. Since I had to get this up and running quickly, I just installed another network card, mapped it to a virtual switch in VMware, and mapped the second NIC in the VM to that virtual switch.


This approach, however, is not very efficient or redundant. It also takes up valuable NICs and switch ports. My plan is to update this configuration with what I’ve learned when setting up our print server to work with FingerPrint, which I’m going to detail below.

How to set up VLAN tagging in VMware ESXi

  1. First, you need to have a working ESXi host. The setup isn’t that hard but is more than I’m going to go into here.
  2. Set up your switch port(s) that connect to the server as a “trunk” in Cisco speak, with the “Native VLAN” set to whatever a majority of your servers use. That way you don’t have to set up tagging on every vNIC.
  3. If you’re looking to have a server that needs to talk to two different subnets, like a firewall or my print server running FingerPrint, add another Ethernet adapter to your VM and assign it to your default network. Mine is “VM Network”.
  4. Check that the virtual network on your primary vSwitch allows all VLANs. By default it is set to “None(0)”. Set it to “All(4095)” or just the ones you want (see the PowerCLI sketch after this list).
  5. Now, start up your VM and log in. Navigate to the Device Manager and select the network card you want to configure a different VLAN on.
  6. Once you configure the tagging, make sure that you have the IP addresses set up correctly. For a firewall-type VM, you will have different IPs and gateways on different subnets. If you have a server connecting to two private networks, only set a default gateway on the “primary” network. Windows doesn’t like it if you set different gateways to the same routed network.
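For step 4, you can also make the change from the command line with VMware PowerCLI instead of the vSphere Client. A minimal sketch, assuming PowerCLI is installed and that “VM Network” is the port group you want to open up (the host address is a placeholder):

# Connect to the ESXi host (placeholder address - use your own)
Connect-VIServer -Server 10.0.0.10

# Allow all VLAN tags (4095) on the port group
Get-VirtualPortGroup -Name "VM Network" | Set-VirtualPortGroup -VLanId 4095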

That’s it. Now your servers can use different and special VLANs when needed, and you don’t need to add another NIC or vSwitch each time. In my case, it allowed me to easily set up FingerPrint to communicate with our wireless network over the Bonjour protocol.

For my friends who are more versed in VMware than I am, please post your comments and questions. I’m always interested in what others are doing, or what the “right” way is.

Apple AirPrint and FingerPrint

So you have a shiny iPad or iPhone and want to occasionally print. Sounds simple, right? Well, Apple has your back and has come out with a great “new” feature called AirPrint that will fix all of that. That is, if you have one of the few new printers that support AirPrint.

A lot of us are seeing more iOS devices on our networks and more users who expect new features from Apple to just work. They have little patience for us or the market to align with the Apple way of doing things. We also have some expensive, high-efficiency, feature-rich printers and copiers on our network that we can’t afford to just replace.

Our first solution was a hack program called “AirPrint Service for Windows” that worked well until iOS 5. Then it broke.

There were some other iOS apps that let you print to network printers, but they cost money and you have to pay for & install them on every device.

Earlier this month I found a program called “FingerPrint”. You can check it out at http://www.collobos.com/  They have a Windows and a Mac version and also provide a free one-week trial. If you still like it after the trial, you can buy it for $10! I wondered just how well a $10 application could work, but read on.

After running the quick install file and selecting the printers I wanted to share, I was up and running!


It also has a cool feature that I’ve yet to try that allows you to “Print to Dropbox”, where you can tie it to a Dropbox folder. This could work on a personal computer, but I don’t see it working well on a network print server.

Once things are installed and working, connect your iPad or iPhone to the wireless network (it has to be the same subnet though. Stupid Bonjour!), and your printers will show up in the “Select Printer” dialog.


That’s it. After the free trial I bought it. It’s working great so far. I’ve not tested this on multiple print servers on the same subnet yet so I’m not sure how that would work.

There is one caveat though. Your print server has to be on the same subnet as your wireless network. This poses a problem for those of us that have an enterprise wireless solution on a different subnet. During the trial period, I set up a Linksys AP on the same subnet as the server and it worked fine. But it kind of defeats the ease of use I was going for when people have to connect to another wireless network just to print.

I’ll address how I got around this limitation in my next blog post about connecting virtual servers in VMware to multiple subnets.

Read about it here.