INTERACTIVE LIVE VIDEO BROADCAST AND ITS TECHNOLOGY
At the beginning of November 2018, I decided to overhaul my Hubblescope live video broadcast Android app, which I wrote about 2.5 years ago and which was available for download on Google Play for about a year.
This software supports not only live video broadcasts but also live video calls, much like Turkcell's service or Skype. There is also a fun section you could call a lightweight Snapchat: you pick an overlay object such as a beard or glasses, record a 15-second clip, and the app shows both a picture with all of the overlays and the 15-second video itself. There is a separate video about it on my YouTube channel.
First of all, my aim is to keep all my valuable projects up to date in terms of libraries and compilers, to move them toward a faster, more effective, less error-prone, more robust structure, and to refresh my own knowledge along the way. The purpose of this article is to share all of this with you.
For this project, 2.5 years brought quite a lot of change. Everything from the Android Studio version to Gradle and the repositories had moved on, and the old libraries I used either no longer worked or failed at the first compilation. Fixing these and, more importantly, making the RTSP broadcast library more robust, along with some additions and removals in the code, were what I needed to do.
I worked for more than 10 days, sometimes 12-14 hours a day, and finished it on November 12.
Now, to the software. There is another article about Hubblescope on my blog covering its features and so on, but here I want to share a bit more: something more technical and code-oriented, without boring other readers.
You can find the latest video presentation on YouTube. The YouTube address is already at the top of every page of this blog, but I'll post it here again:
*** NOTE: You can see the images more clearly by right-clicking an image and opening it in a new tab to view it larger.
First of all, I used MS SQL Server 2012 as the database.
The Web API is written in C#, hosted on IIS 8.5.
On the Android side, I used the Retrofit library for the REST services and interfaces that talk to this API.
I used Wowza 4.3 standalone as the media server and PubNub for the XMPP-style messaging.
I used the RTSP protocol for all live broadcasts and playback.
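To make the RTSP side concrete: a live stream on a Wowza-style media server is addressed by an RTSP URL built from the server host, port, application name, and stream name. The helper below is a minimal sketch; the host, application, and stream names are hypothetical examples, not the real Hubblescope configuration values.

```java
// Builds an RTSP URL of the form rtsp://host:port/application/streamName.
// All values here are illustrative; in the real app they would come from
// the CONFIG table on the database side.
class RtspUrlBuilder {
    static String build(String host, int port, String application, String streamName) {
        return "rtsp://" + host + ":" + port + "/" + application + "/" + streamName;
    }
}
```

For example, with Wowza's default streaming port 1935 and an application named "live", `RtspUrlBuilder.build("media.example.com", 1935, "live", "user42_broadcast")` yields the URL the player or broadcaster would open.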
I used Twitter Fabric for authentication and Crashlytics for crash reporting.
When you combine all of this with suitable modeling and design, this software emerges.
In the database, the CONFIG table, which holds the configuration settings, is especially important. Below you can see an image from SQL Server 2012 running in a VMware virtual machine.
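A CONFIG table like this typically holds key/value rows that the application reads once and caches. As a minimal sketch of how such settings might be cached on the client side (the key names below are hypothetical, not the real Hubblescope settings):

```java
// Minimal sketch of caching CONFIG rows (key/value pairs) in memory.
// Keys and values here are invented examples for illustration.
class AppConfig {
    private final java.util.Map<String, String> values = new java.util.HashMap<>();

    void put(String key, String value) { values.put(key, value); }

    // Returns the configured value, or a default if the key is missing.
    String get(String key, String defaultValue) {
        return values.getOrDefault(key, defaultValue);
    }

    int getInt(String key, int defaultValue) {
        String v = values.get(key);
        return v == null ? defaultValue : Integer.parseInt(v);
    }
}
```

The point of the defaults is that a missing or mistyped row degrades gracefully instead of crashing the app at startup.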
Now let me show this database as business entities, with a logical class diagram on the code side, and of course the main API project, in the picture below.
In other words, the work starts from the database and continues down through the software layers. (It may even be more natural to write the Android software first and then the API.) The API is our interface layer between the Android application and the database: the Android side makes a request, and the code in the API retrieves the information from the database and sends it back to the Android application.
And now, in the pictures below, I will try to show in a few steps how this API is called on the Android side.
First, let's look at the code of our REST service. All data travels between Android and the API, in both directions, in JSON format.
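To illustrate the JSON shape travelling over the wire: in the app, Retrofit's converter produces the body automatically from a request object; the tiny hand-rolled sketch below just makes the format visible. The class and field names are hypothetical.

```java
// Sketch of a JSON request body as it travels from Android to the API.
// In the real app a Retrofit converter serializes this automatically;
// the names here are invented for illustration.
class StartBroadcastRequest {
    final String userName;
    final String streamName;

    StartBroadcastRequest(String userName, String streamName) {
        this.userName = userName;
        this.streamName = streamName;
    }

    String toJson() {
        return "{\"userName\":\"" + userName + "\",\"streamName\":\"" + streamName + "\"}";
    }
}
```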
After that, let's look at the LIVE class, the business-layer (BL) class corresponding to the relevant table in our database.
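A business-entity class like LIVE is essentially a plain Java object mirroring the table's columns. The screenshot isn't reproduced here, so the sketch below uses hypothetical field names chosen to match what a broadcast row would typically store:

```java
// Minimal sketch of a LIVE business entity. Field names are hypothetical,
// mirroring what a broadcast record would plausibly contain.
class Live {
    private final int id;
    private final String userName;
    private final String streamName;
    private final String thumbUrl;
    private int heartCount;

    Live(int id, String userName, String streamName, String thumbUrl) {
        this.id = id;
        this.userName = userName;
        this.streamName = streamName;
        this.thumbUrl = thumbUrl;
        this.heartCount = 0;
    }

    int getId() { return id; }
    String getUserName() { return userName; }
    String getStreamName() { return streamName; }
    String getThumbUrl() { return thumbUrl; }
    int getHeartCount() { return heartCount; }
    void addHeart() { heartCount++; }
}
```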
After the LIVE class above and the REST service, it's time for the REST service interface. Its picture is below:
The main purpose is to use this interface to create the APIService object inside the REST service class; calling methods on this object sends our commands to the API on the other side, which does the actual work.
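In the real app this is a Retrofit interface and Retrofit generates the implementation. Here is a library-free sketch of the same pattern, so the idea stands on its own: the calling code depends only on the interface, and a concrete implementation (Retrofit-generated in the app, a stub below) does the actual HTTP work. All names are hypothetical.

```java
// Library-free sketch of the API-interface pattern. In the app the
// implementation is generated by Retrofit; a stub stands in for it here.
interface ApiService {
    String startBroadcast(String userName, String streamName);
}

// Stub implementation for this sketch; a real one would issue an HTTP call.
class StubApiService implements ApiService {
    @Override
    public String startBroadcast(String userName, String streamName) {
        // Pretend the API answered with the stream's relative path.
        return "live/" + streamName;
    }
}

class RestClient {
    private final ApiService api;

    RestClient(ApiService api) { this.api = api; }

    String doStart(String userName, String streamName) {
        return api.startBroadcast(userName, streamName);
    }
}
```

The benefit of this shape is testability: the Android code can be exercised against a stub without a network, and the Retrofit-backed implementation is swapped in at runtime.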
Now, with the REST service picture above and the explanations so far in mind, here are two pictures showing how we use it in the Android code:
Now let's come to the web server. I have defined two web sites on IIS 8.5 and will call them from the software over specific port numbers. First, here is the picture of our API on the IIS admin side:
And now we define the web site where the thumbnail images will be placed, as in the picture below.
On the Android side, when the broadcast starts, a method called dobroadcast is called and everything starts from there.
Here are the pictures:
The job of connecting to the XMPP server, called from the code above, is in the picture below:
Again inside dobroadcast: the picture below shows the process of taking thumbnail pictures from the image buffer, sending them to the API as a string, converting them back into images on the API side, and placing them in the relevant directory:
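The "thumbnail as a string" step generally means Base64-encoding the JPEG bytes before putting them into the JSON request; the API side decodes the string back into bytes and writes the image file. A minimal stdlib sketch of that round trip (the class name is hypothetical):

```java
// Sketch of the thumbnail round trip: JPEG bytes -> Base64 string (carried
// inside the JSON request to the API) -> bytes again on the server side,
// ready to be written into the thumbnails directory.
class ThumbCodec {
    static String encode(byte[] jpegBytes) {
        return java.util.Base64.getEncoder().encodeToString(jpegBytes);
    }

    static byte[] decode(String base64) {
        return java.util.Base64.getDecoder().decode(base64);
    }
}
```

Base64 inflates the payload by about a third, but it keeps the image safe to embed in a JSON string, which is why this approach is common for small thumbnails.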
Finally, if the user presses stop, the code for the cleanup work to be done is in the picture below:
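The stop path is mostly bookkeeping: the encoder has to stop, the media server connection has to close, and the chat channel has to be left, in a fixed order and only once even if the user taps stop twice. A sketch of that idea as a tiny state machine (the step names are illustrative, not the real teardown code):

```java
// Sketch of stop-path bookkeeping: a broadcast session that can only be
// stopped once, releasing resources in a fixed order. The logged step
// names are illustrative.
class BroadcastSession {
    enum State { IDLE, LIVE, STOPPED }

    private State state = State.IDLE;
    private final java.util.List<String> log = new java.util.ArrayList<>();

    void start() {
        if (state != State.IDLE) throw new IllegalStateException("already started");
        state = State.LIVE;
        log.add("connected");
    }

    void stop() {
        if (state != State.LIVE) return; // ignore duplicate stop presses
        log.add("encoder stopped");
        log.add("media server disconnected");
        log.add("chat channel left");
        state = State.STOPPED;
    }

    State getState() { return state; }
    java.util.List<String> getLog() { return log; }
}
```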
How are hearts and messages sent and received?
First, the picture of our code that captures touches on the device screen:
And the code image of the sendheart method called from within that code:
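A common concern in a tap-to-heart handler is throttling: rapid taps should not flood the message channel. The sketch below shows only that throttling logic, with the clock passed in explicitly so it stays testable; it is an illustration of the idea, not the actual sendheart code from the screenshot.

```java
// Sketch of tap-to-heart throttling: a heart is "sent" at most once per
// interval, so rapid taps do not flood the message channel.
class HeartSender {
    private final long minIntervalMs;
    private long lastSentMs;
    private boolean hasSent = false;
    private int sentCount = 0;

    HeartSender(long minIntervalMs) { this.minIntervalMs = minIntervalMs; }

    // Returns true if a heart message was sent for this tap.
    boolean onTap(long nowMs) {
        if (hasSent && nowMs - lastSentMs < minIntervalMs) return false;
        hasSent = true;
        lastSentMs = nowMs;
        sentCount++;
        return true;
    }

    int getSentCount() { return sentCount; }
}
```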
Sending a heart works as above.
So how do viewers of the live broadcast see those hearts at the same time? In my flying-hearts article I shared the entire code of this animation, with tips on how to make everyone see it in such an application. Here is its code picture below:
How are messages sent and received? In other words, how do viewers send messages, and see the messages others sent, while the video is live? First the sending code picture, then the receiving one:
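Both chat messages and hearts ride on the same publish/subscribe pattern: the sender publishes to a channel, and every subscribed viewer receives the event. In the app, PubNub plays this role; the in-memory sketch below only demonstrates the pattern, with no networking.

```java
// Library-free sketch of the publish/subscribe pattern used for chat and
// hearts: publish to a channel, every subscriber receives the message.
// In the real app, PubNub provides this channel over the network.
class Channel {
    interface Listener { void onMessage(String message); }

    private final java.util.List<Listener> listeners = new java.util.ArrayList<>();

    void subscribe(Listener l) { listeners.add(l); }

    void publish(String message) {
        for (Listener l : listeners) l.onMessage(message);
    }
}
```

The same channel can carry both kinds of events; the receiver just inspects the payload to decide whether to append a chat line or launch a flying-heart animation.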
Here is the link of the test video of the working software with 2 phones:
That is all. Of course, the job doesn't end there; there are hundreds of lines of code I haven't shared with you. But I think you can now work out the logic easily, if you dare to write such software.