Live streaming has become hugely popular around the world through smartphone applications. Apps such as Periscope, Meerkat, and Facebook Live have attracted large audiences, and social platforms are investing in new technologies to stay ahead. Recently, Twitter has been developing technology that automatically recognises what is happening in a live video.
Twitter's deep learning team, known as Cortex, has built a system that labels the moving images in live video streams. This is an impressive feat, requiring deep learning backed by enormous computing power. Building on Cortex, the platform plans to develop a sophisticated filtering system that can curate the wide variety of content shared on Twitter, informed by how content has been shared in the past. Twitter is currently testing the technology through its live streaming app, Periscope, and the system can filter copyrighted content as well.
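The article does not describe how such filtering would work internally, but the basic idea of curating streams from predicted labels can be sketched in a few lines. This is a hypothetical illustration, not Twitter's actual API: the label names, the `BLOCKED_LABELS` set, and the `should_filter` function are all assumptions made for the example.

```python
# Hypothetical sketch of label-based stream filtering.
# All names here (BLOCKED_LABELS, should_filter) are illustrative,
# not part of any real Twitter or Periscope API.

BLOCKED_LABELS = {"copyrighted_music", "graphic_violence"}

def should_filter(predicted_labels, threshold=0.8):
    """Return True if any blocked label is predicted above the threshold.

    predicted_labels: dict mapping label name -> confidence score in [0, 1],
    as might be produced by a video classification model.
    """
    return any(
        score >= threshold
        for label, score in predicted_labels.items()
        if label in BLOCKED_LABELS
    )

# A stream labelled as containing copyrighted music would be filtered:
print(should_filter({"concert": 0.9, "copyrighted_music": 0.85}))  # True
# An innocuous stream would pass through:
print(should_filter({"cooking": 0.95}))  # False
```

In a real system the confidence threshold would be tuned per label, since the cost of wrongly blocking a legitimate stream differs from the cost of missing infringing content.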
Recognizing the content of live video is an entirely new trick. Researchers have made notable progress in recent years with algorithms that can recognise objects in still photos, but it is significantly harder to do with live video of fluctuating quality. Doing it instantly also requires considerable computing power. Twitter effectively assembled a custom supercomputer made entirely of graphics processing units (GPUs) to perform the video classification and serve up the results. These chips are particularly efficient at the mathematical calculations required for deep learning, but normally they are only one part of a larger computer system.
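One reason live video is harder than still photos is that individual frames are noisy: a GPU-backed network may mislabel a single blurry frame even when the stream's subject is obvious. A common remedy, shown here as a minimal sketch rather than Twitter's actual method, is to smooth per-frame predictions with a sliding majority vote over recent frames. The `stream_labels` function and the example labels are assumptions made for illustration.

```python
from collections import Counter, deque

def stream_labels(frame_predictions, window=5):
    """Smooth noisy per-frame labels with a sliding majority vote.

    frame_predictions: iterable of label strings, one per video frame
    (in a real system these would come from a neural network).
    Returns the majority label over the last `window` frames at each step.
    """
    recent = deque(maxlen=window)
    smoothed = []
    for label in frame_predictions:
        recent.append(label)
        # Counter.most_common(1) gives the modal label in the window.
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return smoothed

# A single mislabelled frame ("dog") is outvoted by its neighbours:
frames = ["cat", "cat", "dog", "cat", "cat", "cat"]
print(stream_labels(frames, window=3))  # ['cat', 'cat', 'cat', 'cat', 'cat', 'cat']
```

The trade-off is latency: a larger window suppresses more noise but makes the system slower to react when the scene genuinely changes, which matters for a live stream.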