Deep learning applications are now truly impressive, ranging from image detection to natural language processing (for example, automatic translation). They become even more remarkable when deep learning works unsupervised or learns its own representations of the data.
This post is about how to ingest data from different kinds of file systems by means of Kafka Connect, using a connector I’ve forged recently.
On 26 March 2012, James Cameron and his submersible, Deepsea Challenger, explored the depths of the ocean down to 11 km below sea level at 11.329903°N 142.199305°E, an infinitesimal point on the surface of the Earth’s vast oceans.
Our participation at Spark Summit 2017 in San Francisco was once again a great experience, not only because of the quality of the speakers, but also because of the special atmosphere created by Spark lovers. To enliven the breaks between sessions, we brought a riddle and invited all attendees to solve it.
When we want to fit a Machine Learning (ML) model to a big dataset, it is often recommended to carefully pre-process the input data in order to obtain better results.
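As a taste of what that pre-processing can look like, here is a minimal sketch of one common step, standardization (rescaling each feature to zero mean and unit variance); the function name and sample values are illustrative, not taken from the post:

```python
def standardize(column):
    """Scale a list of numbers to zero mean and unit variance."""
    n = len(column)
    mean = sum(column) / n
    variance = sum((x - mean) ** 2 for x in column) / n
    std = variance ** 0.5 or 1.0  # guard against constant columns
    return [(x - mean) / std for x in column]

# Illustrative usage on a toy feature column
scaled = standardize([10.0, 20.0, 30.0])
print(scaled)
```

Many ML algorithms (gradient-based optimizers, distance-based methods) converge faster or behave better when features share a common scale, which is why this kind of step is so often recommended.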
On 12–13 May 2017, the 6th edition of UX Spain took place. If there were a scale for measuring when an event has reached maturity, UX Spain would, after this edition, sit at the top.
This will be the last installment in the “Continuous Delivery in depth” series. After the good and the bad, here comes the ugly. Ugly because of the sheer number of changes required: a pull request with 308 commits was merged, adding 2,932 lines whilst removing a whopping 10,112.
Companies have recently come to realize that the real value of their business lies in their data. There has been a rush to create huge data lakes to store the enormous amounts of data available inside each company.
We don’t usually like to boast, but on this one we can’t hold back. As of 17 February 2017, a huge (if symbolic) milestone was reached: more than 1,000 automated releases performed by our Jenkins installation across our continuous delivery pipelines.
This is the first part of a story, a story about how important it is to have a reliable release and deployment process.