Boomi this week revealed it has acquired Rivery, a provider of a data management platform that makes it simpler to move data.
Ed Macosky, chief product and technology officer for Boomi, said the addition extends the Boomi platform's ability to connect applications to include moving data between them.
Rivery developed a set of extract, load and transform (ELT) tools that include change data capture capabilities that enable updates to, for example, a database to be shared in real time across multiple applications.
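To make the change data capture (CDC) idea concrete, here is a minimal, hypothetical sketch using SQLite triggers: every write to a table is also recorded in a change log that downstream consumers can poll and replay in order. The table and column names are illustrative only, not a description of Rivery's actual implementation, which in practice typically reads a database's transaction log rather than using triggers.

```python
import sqlite3

# Hypothetical trigger-based CDC sketch (illustrative names, not Rivery's design).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT);
CREATE TABLE change_log (
    seq    INTEGER PRIMARY KEY AUTOINCREMENT,  -- preserves change order
    op     TEXT,                               -- 'INSERT' or 'UPDATE'
    row_id INTEGER,
    email  TEXT
);
CREATE TRIGGER cdc_ins AFTER INSERT ON customers BEGIN
    INSERT INTO change_log (op, row_id, email)
    VALUES ('INSERT', NEW.id, NEW.email);
END;
CREATE TRIGGER cdc_upd AFTER UPDATE ON customers BEGIN
    INSERT INTO change_log (op, row_id, email)
    VALUES ('UPDATE', NEW.id, NEW.email);
END;
""")

db.execute("INSERT INTO customers VALUES (1, 'a@example.com')")
db.execute("UPDATE customers SET email = 'b@example.com' WHERE id = 1")

# A downstream consumer replays the log to keep a replica in sync.
changes = db.execute(
    "SELECT op, row_id, email FROM change_log ORDER BY seq"
).fetchall()
print(changes)
# [('INSERT', 1, 'a@example.com'), ('UPDATE', 1, 'b@example.com')]
```

The key property is that consumers see an ordered stream of changes rather than periodic full snapshots, which is what lets an update to one database propagate to multiple applications in near real time.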
The overall goal is to make it simpler to share data across a set of business processes that are connected via the Boomi integration platform as a service (iPaaS) environment, said Macosky.
The addition of Rivery also presents an opportunity for IT organizations to reduce the number of platforms they would otherwise have to integrate themselves, he noted.
The need for that capability will become even more pressing as organizations start to add artificial intelligence (AI) agents to those workflows, added Macosky.
ELT tools trace their lineage to extract, transform and load (ETL) tools often associated with backup and recovery tasks. However, data engineers in the age of the cloud are now using that same core capability to extract and load data before transforming it for use by, for example, an analytics application. In fact, much of the training and customization of AI models requires the expertise of data engineers using ELT tools. Ultimately, the goal is to make sure the right data consistently arrives at the right place at the right time.
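The ETL-versus-ELT distinction above comes down to where the transformation happens. A minimal sketch, assuming illustrative table names and an in-memory SQLite database standing in for both the source system and the warehouse: rows are extracted and loaded as-is, and the transformation runs afterward inside the destination using its SQL engine.

```python
import sqlite3

# A source system with raw transactional data (illustrative schema).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 1999), (2, 2500), (3, 750)])

warehouse = sqlite3.connect(":memory:")

# Extract + Load: copy rows untouched -- no transformation yet.
rows = source.execute("SELECT id, amount_cents FROM orders").fetchall()
warehouse.execute(
    "CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
warehouse.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)

# Transform: performed inside the destination with SQL -- the defining
# trait of ELT, versus classic ETL where this step precedes loading.
warehouse.execute(
    "CREATE TABLE orders_usd AS "
    "SELECT id, amount_cents / 100.0 AS amount_usd FROM raw_orders"
)
total = warehouse.execute(
    "SELECT SUM(amount_usd) FROM orders_usd").fetchone()[0]
```

Deferring the transform to the destination is what lets cloud warehouses apply their own compute to reshape data for each consuming application, rather than fixing one transformed shape at load time.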
Of course, the volume of data that might need to be moved at any given time continues to expand. In addition, there are more data types than ever being incorporated into business workflows. As such, demand for data engineering expertise continues to exceed the available supply.
Hopefully, advances in AI will make it simpler to manage, classify, process and secure data at scale using DataOps best practices. The challenge is that in many organizations there is still a lot of room for improvement when it comes to managing data. Historically, IT teams have been responsible for storing data for a specific application. However, more applications than ever are now accessing the same core pool of data that might reside, for example, in a data lake.
One way or another, just about every organization will in the months ahead need to improve the way it manages data. Just about everyone recognizes that data is the proverbial new oil, but there isn't enough capacity to turn all that raw material into actionable insights that an organization trusts.
Whether achieving that goal requires appointing a chief data officer or simply ensuring that best practices are followed will vary from one organization to the next. The one thing that is certain is that IT professionals who have data engineering expertise will continue to be worth their weight in gold as more organizations realize their ability to compete depends on the richness of the data they can access.