Join us for an Apache Kafka® meetup on October 15th from 6:30 pm, hosted at Makers Tribe in Chennai! The address, agenda and speaker information can be found below. See you there!
6:30 pm - Registration
6:40 pm - A journey from a legacy way of Video Ad Tracking to Kafka-based Video Ad Tracking - Ganeshkumar Ramachandran (Gramcha), Pando Corp
7:40 pm - Networking & Dinner
8:00 pm - Group Photo
Ganeshkumar Ramachandran (Gramcha)
Gramcha is a Principal Software Engineer at Pando Corp, doing Truck Science. Previously, he worked as an engineer at Ooyala and ADF Data Science, where he gained extensive experience with Java, Docker, and Apache Kafka. He is a frequent attendee of JUG meetups and speaks about microservices and data pipelines at various events. Most recently, he has dedicated his time to digitizing logistics at scale and improving the logistics supply chain using Kafka. In his spare time, he enjoys spending time with his family and friends.
A journey from a legacy way of Video Ad Tracking to Kafka-based Video Ad Tracking
The video Ad tracking events are not much different from non-video Ad tracking events.
Non-video Ads typically have trackers such as delivery, impressions, clicks, and installs (in the case of mobile app Ads). Video Ads have trackers such as delivery, impression, and 10%, 20%, 30%, ... 100% completed.
In our case, the existing video Ad delivery system captures the trackers, but it is not a scalable solution when millions of video Ads are watched simultaneously. This is a unique case where millions of concurrent users watch video Ads at the same time, similar to YouTube.
Our client is an OTT media service that streams cricket matches and shows video Ads between each over of the match. Millions of users generally watch a match concurrently, and some high-profile matches have over 10 million concurrent users!
Each break delivers 3 or 4 short Ads, and each Ad triggers 10+ tracker events to the Ad server. These trackers were handled in an old-fashioned batch-processing way. We have converted that into real-time data processing using Kafka.
We will look in detail at what the old solution was and how we converted it into a Kafka-based solution without disturbing the existing legacy system.
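To make the scale of the fan-out concrete: the abstract describes each fully watched video Ad emitting a delivery event, an impression event, and a completion event at every 10% of playback. A minimal sketch of that event model, and of publishing it to Kafka keyed by Ad ID, might look like the following (the topic name, field names, and helper functions here are illustrative assumptions, not the speaker's actual implementation):

```python
# Hypothetical sketch of the per-Ad tracker fan-out described in the abstract.
# One fully watched video Ad emits: delivery + impression + 10%..100% completed.
# Topic and field names are illustrative assumptions, not from the talk.
import json

COMPLETION_STEPS = range(10, 101, 10)  # 10%, 20%, ..., 100%

def tracker_events(ad_id: str, user_id: str) -> list[dict]:
    """Build the 12 tracker events one fully watched video Ad would emit."""
    base = [{"ad_id": ad_id, "user_id": user_id, "tracker": t}
            for t in ("delivery", "impression")]
    completions = [{"ad_id": ad_id, "user_id": user_id,
                    "tracker": f"{pct}% completed"}
                   for pct in COMPLETION_STEPS]
    return base + completions

def publish(producer, events, topic="ad-trackers"):
    """Send each event to Kafka, keyed by ad_id so one Ad's events land
    on the same partition and stay in order (assumes a kafka-python
    KafkaProducer or any object with a compatible send() method)."""
    for event in events:
        producer.send(topic,
                      key=event["ad_id"].encode(),
                      value=json.dumps(event).encode())

events = tracker_events("ad-42", "user-7")
print(len(events))  # 12 events per fully watched Ad
```

At 10 million concurrent viewers and 3–4 Ads per break, this per-Ad fan-out is what makes batch processing impractical and a partitioned, real-time pipeline attractive.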
KAFKA SUMMIT SF 2019: September 30th to October 1st
We are able to offer you a 25% discount on the standard-priced ticket for Kafka Summit San Francisco (September 30th & October 1st). To redeem it, please go to bit.ly/KSummitMeetupInvite, click ‘register’, select ‘Conference Pass’ and enter the community promo code “KS19Meetup”.
If you would like to speak at or host our next event, please let us know! email@example.com
NOTE: We're unable to cater to attendees under the age of 18. Please do not sign up for this event if you're under 18.