{"id":6225,"date":"2017-07-27T10:34:31","date_gmt":"2017-07-27T10:34:31","guid":{"rendered":"http:\/\/stratio.com\/?p=6225"},"modified":"2023-09-20T13:29:57","modified_gmt":"2023-09-20T13:29:57","slug":"data-ingestion-kafka-connector","status":"publish","type":"post","link":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/","title":{"rendered":"One File System to ingest them all (and in Kafka bind them)"},"content":{"rendered":"<p>Sounds epic, doesn\u2019t it? Actually, it\u2019s not that epic!<\/p>\n<p>It could be interesting (or very geeky) to talk about how to ingest data in Middle-earth (and what for), but that would be beyond the scope of this blog, so I\u2019m afraid this post has nothing to do with that. This post is about how to ingest data from different kinds of file systems by means of Kafka-Connect, using a connector I\u2019ve <i>forged <\/i>recently.<\/p>\n<p><!--more--><\/p>\n<p>A lot has happened on the way from legacy systems -in terms of both hardware and software- for managing huge amounts of data to Big Data systems: the architectures we use for processing data today are quite different from those of the &#8220;old days&#8221;. Think of the approaches we are using or hearing a lot about, such as Microservices or Lambda and Kappa architectures, in which we find a wide spectrum of amazing technologies, thanks especially to the open source community. 
These approaches focus on how to process data with different objectives, mostly addressing the business needs of data-intensive applications.<a href=\"http:\/\/blog.stratio.com\/wp-content\/uploads\/2017\/07\/lord.jpg\"><br \/>\n<\/a><br \/>\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-8104 size-full\" src=\"http:\/\/blog.stratio.com\/wp-content\/uploads\/2017\/07\/lord.jpg\" alt=\"Lord of the rings\" width=\"1920\" height=\"1080\" \/><\/p>\n<h2>How to deal with legacy systems<\/h2>\n<p>Even though new Big Data systems have emerged with a completely different paradigm from the &#8220;legacies&#8221; -solving some of the issues organizations were complaining about- and their popularity keeps growing, most businesses see the prospect of replacing them as too risky and feel locked into &#8220;legacy systems&#8221; that are often critical to their day-to-day operations. Dealing with these systems can sometimes be a bit awkward: they impose strong requirements when we integrate our apps with their ecosystem, breaking some rules we normally wouldn\u2019t but, at least for the time being, we have to manage this and do our best. For this reason, Big Data technologies cannot look away and must face these limitations on a daily basis.<\/p>\n<p>One interesting thing users\/organizations are realizing is the importance of streaming data and the &#8220;goldmine&#8221; it represents. I\u2019m not going to talk about how important (near) real-time processing is becoming or what a Kappa architecture is. I will talk about how data stored in a \u2018mainframe\u2019 or something similar could be fed into a streaming platform.<\/p>\n<p>It\u2019s not rare, when trying to integrate your application straight into one of these \u2018legacy systems\u2019, to come across someone who tells you: &#8220;No, you can\u2019t access it in that way&#8221;. What now, then? 
One alternative is to export all the data you need to files in a shared file system and process them later. If we\u2019re talking about ingesting this data into a streaming platform, one of the core pieces we use intensively in our Big Data platform is <a href=\"https:\/\/kafka.apache.org\" target=\"_blank\" rel=\"noopener noreferrer\">Apache Kafka<\/a>.<\/p>\n<p>We can consider Kafka the de-facto standard for streaming architectures: it\u2019s very active, continuously improving, and its ecosystem is rich and growing. One of the components closely related to Kafka is Kafka-Connect, a framework for moving data between Kafka and external systems. Currently, there are dozens of Kafka-Connect connectors available which allow us to ingest or bulk-load data from\/to several kinds of systems, but in this post I\u2019m focusing on a connector I\u2019m currently working on: <a href=\"https:\/\/github.com\/mmolimar\/kafka-connect-fs\" target=\"_blank\" rel=\"noopener noreferrer\">kafka-connect-fs<\/a>. 
So let\u2019s get into the nitty-gritty!<\/p>\n<h2>What is this connector all about?<\/h2>\n<p>In a simple way: it\u2019s a Kafka-Connect source connector (for the moment) that ingests data from files in different formats persisted in a file system and loads them into Kafka.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-8100 size-full\" src=\"http:\/\/blog.stratio.com\/wp-content\/uploads\/2017\/07\/shutterstock_85778389_72.jpg\" alt=\"Data\" width=\"1200\" height=\"840\" \/><\/p>\n<h2>Why this connector?<\/h2>\n<p>Well, as I mentioned, there are a lot of connectors out there and some of them even ingest data from files, but the idea behind this one was to provide an abstraction layer over the native file system itself, thus allowing you to connect to \u201cany\u201d file system you like.<\/p>\n<p>Thanks to the <i>hadoop-common<\/i> library, we get this abstraction layer through the abstract class \u2018org.apache.hadoop.fs.FileSystem\u2019, which defines a generic file system API and allows custom implementations. Out of the box, there are several file system types you can use and, in case yours is not included, you can develop your own! Some of the ones you can use are:<\/p>\n<ul>\n<li>HDFS.<\/li>\n<li>WebHDFS.<\/li>\n<li>S3.<\/li>\n<li>FTP.<\/li>\n<li>Local File System.<\/li>\n<li>Hadoop Archive File System.<\/li>\n<\/ul>\n<p>There are surely more custom implementations of these file systems out there, but these should cover most of our use cases.<\/p>\n<p>By the way, you can find the list of connectors <a href=\"https:\/\/www.confluent.io\/product\/connectors\/\" target=\"_blank\" rel=\"noopener noreferrer\">here<\/a>.<\/p>\n<h2>Features<\/h2>\n<p>There are two main concepts within the connector you have to take into account:<\/p>\n<ul>\n<li>Policies: they define how you\u2019re going to poll data from the file system. 
For instance: continuously, from time to time, via a file watcher, etc.<\/li>\n<li>File readers: the reader to use depending on the data format of the source files in the file system. For now, the connector supports plain text (delimited or not), Avro, Parquet and Sequence Files.<\/li>\n<\/ul>\n<p>Policies and readers have their own custom configurations, and there are even more config options in the <a href=\"http:\/\/kafka-connect-fs.readthedocs.io\/\" target=\"_blank\" rel=\"noopener noreferrer\">documentation site<\/a>, where you can also find some tips, an FAQ section and lots more.<\/p>\n<h2>How this works<\/h2>\n<p>When the connector starts, the URIs included in the connector config are grouped based on the number of tasks defined in that config. After that, the tasks are initialized and start polling data by means of the defined policy.<\/p>\n<p>The policy handles the connection to the file system, retrieves files based on its own configuration and provides the reader for processing each file and its corresponding records. The policy also carries out the offset management: if the same file is processed again, it will seek the file from the <i>last committed offset<\/i> (which does not necessarily mean the current record matches the last one processed) to avoid processing the same records over and over again.<\/p>\n<p>Once this is done, the reader delivers the corresponding records to the running tasks, which copy them to Kafka and&#8230; that\u2019s it!<\/p>\n<p>Notice that you won\u2019t get <i>exactly-once<\/i> semantics with source connectors, as Kafka doesn\u2019t support it yet, but you will get <i>at-least-once<\/i> and <i>at-most-once<\/i> semantics. However, this will presumably be supported after this <a href=\"https:\/\/cwiki.apache.org\/confluence\/display\/KAFKA\/KIP-98+-+Exactly+Once+Delivery+and+Transactional+Messaging\" target=\"_blank\" rel=\"noopener noreferrer\">KIP<\/a>. 
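The offset management described above can be pictured with a tiny sketch (Python used here purely for illustration; the connector itself is written in Java and delegates offset storage to Kafka-Connect, so this is not its actual code):

```python
# Illustrative sketch of the policy's offset bookkeeping: each file URI
# maps to the last committed offset, and a re-processed file is read
# from that offset onwards instead of from the beginning.

def poll_file(lines, committed_offsets, uri):
    """Return only the records after the last committed offset for this URI."""
    start = committed_offsets.get(uri, 0)
    new_records = lines[start:]
    committed_offsets[uri] = len(lines)  # commit the new offset
    return new_records

offsets = {}
# First pass: no offset committed yet, so the whole file is delivered.
first = poll_file(["r1", "r2"], offsets, "file:///data/1.txt")   # ["r1", "r2"]
# Second pass: the file grew, so only the appended record is delivered.
second = poll_file(["r1", "r2", "r3"], offsets, "file:///data/1.txt")  # ["r3"]
```

This is how the connector avoids re-emitting records it has already copied to Kafka, while still picking up data appended to a file it has seen before.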
In fact, <i>exactly-once<\/i> semantics is already supported in Kafka-Streams after merging this <a href=\"https:\/\/github.com\/apache\/kafka\/pull\/2945\" target=\"_blank\" rel=\"noopener noreferrer\">PR<\/a> (the feature is included as of Kafka 0.11.0.0).<\/p>\n<h2>Running the connector<\/h2>\n<p>If you want to try the connector, you just have to download it from the repo, then compile and package it, and put it into the Kafka-Connect classpath.<\/p>\n<p>The simplest way to try it is to deploy Kafka-Connect in standalone mode. To do this, a properties file with the connector config is required; this file would look like this:<\/p>\n<pre class=\"\">name=FsSourceConnector\nconnector.class=com.github.mmolimar.kafka.connect.fs.FsSourceConnector\ntasks.max=1\nfs.uris=file:\/\/\/data,hdfs:\/\/localhost:9000\/\ntopic=mytopic\npolicy.class=com.github.mmolimar.kafka.connect.fs.policy.SimplePolicy\npolicy.recursive=true\npolicy.regexp=^[0-9]*\\.txt$\nfile_reader.class=com.github.mmolimar.kafka.connect.fs.file.reader.TextFileReader<\/pre>\n<p>And execute this command:<\/p>\n<pre class=\"\">bin\/connect-standalone etc\/kafka\/connect-standalone.properties etc\/kafka-connect-fs\/kafka-connect-fs.properties<\/pre>\n<h2>Future work<\/h2>\n<p>In this post we\u2019ve seen how to ingest data from a wide variety of file systems and copy it into Kafka using the <i>kafka-connect-fs<\/i> connector. There are more configuration options to adapt the connector to your particular use case, and there are a bunch of features I\u2019d like to include in future versions (new file readers, policies, a sink connector and more). 
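As a variant of the standalone config shown earlier, reading Avro files from HDFS could look like the following sketch (the URIs and topic name are made up, and the reader class name follows the connector's file.reader package naming, so double-check everything against the documentation site for your version):

```properties
name=FsSourceConnectorAvro
connector.class=com.github.mmolimar.kafka.connect.fs.FsSourceConnector
tasks.max=1
fs.uris=hdfs://localhost:9000/avro-data
topic=mytopic_avro
policy.class=com.github.mmolimar.kafka.connect.fs.policy.SimplePolicy
policy.recursive=true
policy.regexp=.*\.avro$
file_reader.class=com.github.mmolimar.kafka.connect.fs.file.reader.AvroFileReader
```

The same standalone command applies; only the properties file changes.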
Coming soon&#8230;<\/p>\n<p>In case the current version doesn\u2019t fit your needs, you\u2019re free to implement new features\u00a0and very welcome to contribute to the project!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This post is about how to ingest data from different kinds of file systems by means of Kafka-Connect using a connector I\u2019ve forged recently.<\/p>\n","protected":false},"author":1,"featured_media":6179,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[686],"tags":[19],"ppma_author":[795],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v22.9 (Yoast SEO v22.9) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>One File System to ingest them all (and in Kafka-Connect bind them)<\/title>\n<meta name=\"description\" content=\"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a connector.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"One File System to ingest them all (and in Kafka bind them)\" \/>\n<meta property=\"og:description\" content=\"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a connector.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\" \/>\n<meta property=\"og:site_name\" content=\"Stratio\" \/>\n<meta property=\"article:published_time\" content=\"2017-07-27T10:34:31+00:00\" \/>\n<meta property=\"article:modified_time\" 
content=\"2023-09-20T13:29:57+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png\" \/>\n\t<meta property=\"og:image:width\" content=\"730\" \/>\n\t<meta property=\"og:image:height\" content=\"312\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Stratio\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@stratiobd\" \/>\n<meta name=\"twitter:site\" content=\"@stratiobd\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Stratio\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\"},\"author\":{\"name\":\"Stratio\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/d0377b199cd052b17e15c9ba44c45ab7\"},\"headline\":\"One File System to ingest them all (and in Kafka bind them)\",\"datePublished\":\"2017-07-27T10:34:31+00:00\",\"dateModified\":\"2023-09-20T13:29:57+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\"},\"wordCount\":1238,\"publisher\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png\",\"keywords\":[\"Big 
Data\"],\"articleSection\":[\"Product\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\",\"url\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\",\"name\":\"One File System to ingest them all (and in Kafka-Connect bind them)\",\"isPartOf\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png\",\"datePublished\":\"2017-07-27T10:34:31+00:00\",\"dateModified\":\"2023-09-20T13:29:57+00:00\",\"description\":\"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a connector.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage\",\"url\":\"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png\",\"contentUrl\":\"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png\",\"width\":730,\"height\":312},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.stratio.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"One File System to ingest them all (and in Kafka bind 
them)\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#website\",\"url\":\"https:\/\/www.stratio.com\/blog\/\",\"name\":\"Stratio Blog\",\"description\":\"Corporate blog\",\"publisher\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.stratio.com\/blog\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#organization\",\"name\":\"Stratio\",\"url\":\"https:\/\/www.stratio.com\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/stratio.com\/blog\/wp-content\/uploads\/2020\/06\/stratio-web-logo-1.png\",\"contentUrl\":\"https:\/\/stratio.com\/blog\/wp-content\/uploads\/2020\/06\/stratio-web-logo-1.png\",\"width\":260,\"height\":55,\"caption\":\"Stratio\"},\"image\":{\"@id\":\"https:\/\/www.stratio.com\/blog\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/stratiobd\",\"https:\/\/es.linkedin.com\/company\/stratiobd\",\"https:\/\/www.youtube.com\/c\/StratioBD\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/d0377b199cd052b17e15c9ba44c45ab7\",\"name\":\"Stratio\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/image\/bb38888f58c2bb664646155f78ae6ccc\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/e3387ad00609f34a56d6796400eb8191?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/e3387ad00609f34a56d6796400eb8191?s=96&d=mm&r=g\",\"caption\":\"Stratio\"},\"description\":\"Stratio guides businesses on their journey through complete #DigitalTransformation with #BigData and #AI. 
Stratio works worldwide for large companies and multinationals in the sectors of banking, insurance, healthcare, telco, retail, energy and media.\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"One File System to ingest them all (and in Kafka-Connect bind them)","description":"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a connector.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/","og_locale":"en_US","og_type":"article","og_title":"One File System to ingest them all (and in Kafka bind them)","og_description":"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a connector.","og_url":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/","og_site_name":"Stratio","article_published_time":"2017-07-27T10:34:31+00:00","article_modified_time":"2023-09-20T13:29:57+00:00","og_image":[{"width":730,"height":312,"url":"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png","type":"image\/png"}],"author":"Stratio","twitter_card":"summary_large_image","twitter_creator":"@stratiobd","twitter_site":"@stratiobd","twitter_misc":{"Written by":"Stratio","Est. 
reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#article","isPartOf":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/"},"author":{"name":"Stratio","@id":"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/d0377b199cd052b17e15c9ba44c45ab7"},"headline":"One File System to ingest them all (and in Kafka bind them)","datePublished":"2017-07-27T10:34:31+00:00","dateModified":"2023-09-20T13:29:57+00:00","mainEntityOfPage":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/"},"wordCount":1238,"publisher":{"@id":"https:\/\/www.stratio.com\/blog\/#organization"},"image":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage"},"thumbnailUrl":"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png","keywords":["Big Data"],"articleSection":["Product"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/","url":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/","name":"One File System to ingest them all (and in Kafka-Connect bind them)","isPartOf":{"@id":"https:\/\/www.stratio.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage"},"image":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage"},"thumbnailUrl":"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png","datePublished":"2017-07-27T10:34:31+00:00","dateModified":"2023-09-20T13:29:57+00:00","description":"Learn how to ingest data from different kinds of file systems by means of Kafka-Connect -a framework used for interacting from\/to Kafka- using a 
connector.","breadcrumb":{"@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#primaryimage","url":"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png","contentUrl":"https:\/\/www.stratio.com\/blog\/wp-content\/uploads\/2017\/07\/Kafka-connectors.png","width":730,"height":312},{"@type":"BreadcrumbList","@id":"https:\/\/www.stratio.com\/blog\/data-ingestion-kafka-connector\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.stratio.com\/blog\/"},{"@type":"ListItem","position":2,"name":"One File System to ingest them all (and in Kafka bind them)"}]},{"@type":"WebSite","@id":"https:\/\/www.stratio.com\/blog\/#website","url":"https:\/\/www.stratio.com\/blog\/","name":"Stratio Blog","description":"Corporate blog","publisher":{"@id":"https:\/\/www.stratio.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.stratio.com\/blog\/?s={search_term_string}"},"query-input":"required 
name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.stratio.com\/blog\/#organization","name":"Stratio","url":"https:\/\/www.stratio.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.stratio.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/stratio.com\/blog\/wp-content\/uploads\/2020\/06\/stratio-web-logo-1.png","contentUrl":"https:\/\/stratio.com\/blog\/wp-content\/uploads\/2020\/06\/stratio-web-logo-1.png","width":260,"height":55,"caption":"Stratio"},"image":{"@id":"https:\/\/www.stratio.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/stratiobd","https:\/\/es.linkedin.com\/company\/stratiobd","https:\/\/www.youtube.com\/c\/StratioBD"]},{"@type":"Person","@id":"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/d0377b199cd052b17e15c9ba44c45ab7","name":"Stratio","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.stratio.com\/blog\/#\/schema\/person\/image\/bb38888f58c2bb664646155f78ae6ccc","url":"https:\/\/secure.gravatar.com\/avatar\/e3387ad00609f34a56d6796400eb8191?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/e3387ad00609f34a56d6796400eb8191?s=96&d=mm&r=g","caption":"Stratio"},"description":"Stratio guides businesses on their journey through complete #DigitalTransformation with #BigData and #AI. 
Stratio works worldwide for large companies and multinationals in the sectors of banking, insurance, healthcare, telco, retail, energy and media."}]}},"authors":[{"term_id":795,"user_id":1,"is_guest":0,"slug":"stratioadmin","display_name":"Stratio","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/e3387ad00609f34a56d6796400eb8191?s=96&d=mm&r=g","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/posts\/6225"}],"collection":[{"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/comments?post=6225"}],"version-history":[{"count":11,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/posts\/6225\/revisions"}],"predecessor-version":[{"id":13585,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/posts\/6225\/revisions\/13585"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/media\/6179"}],"wp:attachment":[{"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/media?parent=6225"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/categories?post=6225"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/tags?post=6225"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.stratio.com\/blog\/wp-json\/wp\/v2\/ppma_author?post=6225"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}