Best IRI Data Manager Alternatives in 2025
Find the top alternatives to IRI Data Manager currently available. Compare ratings, reviews, pricing, and features of IRI Data Manager alternatives in 2025. Slashdot lists the best IRI Data Manager alternatives on the market, offering competing products similar to IRI Data Manager. Sort through the IRI Data Manager alternatives below to make the best choice for your needs.
-
1
DATPROF
DATPROF
Mask, generate, subset, virtualize, and automate your test data with the DATPROF Test Data Management Suite. Our solution helps you manage Personally Identifiable Information and oversized databases. Long waiting times for test data refreshes are a thing of the past. -
2
A powerful iPaaS platform for integration and business process automation. Linx is an integration platform (iPaaS) that enables organizations to connect all their data sources, systems, and applications. The platform is known for its programming-like flexibility and the resulting ability to handle complex integrations at scale. It is a popular choice for growing businesses looking to embrace a unified integration strategy.
-
3
IRI FieldShield
IRI, The CoSort Company
Varies by component/scope. IRI FieldShield® is a powerful and affordable data discovery and de-identification package for masking PII, PHI, PAN, and other sensitive data in structured and semi-structured sources. Front-ended in a free Eclipse-based design environment, FieldShield jobs classify, profile, scan, and de-identify data at rest (static masking). Use the FieldShield SDK or proxy-based applications to secure data in motion (dynamic data masking). The usual method for masking relational databases and flat files (CSV, Excel, LDIF, COBOL, etc.) is to classify the data centrally, search for it globally, and mask it automatically and consistently using encryption, pseudonymization, redaction, or other functions that preserve realism and referential integrity in production or test environments. Use FieldShield to make test data, nullify breaches, or comply with GDPR, HIPAA, PCI DSS, PDPA, and other laws. Audit through machine- and human-readable search reports, job logs, and re-ID risk scores. Optionally mask data when you map it; FieldShield functions can also run in IRI Voracity ETL and federation, migration, replication, subsetting, and analytic jobs. To mask DB clones, run FieldShield in Windocks, Actifio, or Commvault. Call it from CI/CD pipelines and apps. -
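FieldShield's own masking functions are proprietary, but the general idea of consistent pseudonymization, where the same input always masks to the same token so joins and foreign keys still line up afterward, can be illustrated generically. The sketch below is plain Python, not FieldShield code; the field names and secret key are hypothetical.

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-me"  # hypothetical masking key; manage it like any other secret

def pseudonymize(value: str, length: int = 12) -> str:
    """Deterministically mask a value: the same input always yields the same token,
    which preserves referential integrity across tables and runs."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return digest[:length]

# Example: the customer ID masks to the same token in both tables,
# so the join between them still works after masking.
customers = [{"cust_id": "C1001", "name": "Alice Smith"}]
orders = [{"order_id": 1, "cust_id": "C1001", "total": 42.50}]

masked_customers = [{**row, "cust_id": pseudonymize(row["cust_id"]),
                     "name": pseudonymize(row["name"])} for row in customers]
masked_orders = [{**row, "cust_id": pseudonymize(row["cust_id"])} for row in orders]

assert masked_customers[0]["cust_id"] == masked_orders[0]["cust_id"]
print(masked_customers, masked_orders)
```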
4
IRI Voracity
IRI, The CoSort Company
IRI Voracity is an end-to-end software platform for fast, affordable, and ergonomic data lifecycle management. Voracity speeds, consolidates, and often combines the key activities of data discovery, integration, migration, governance, and analytics in a single pane of glass, built on Eclipse™. Through its revolutionary convergence of capability and its wide range of job design and runtime options, Voracity bends the multi-tool cost, difficulty, and risk curves away from megavendor ETL packages, disjointed Apache projects, and specialized software. Voracity uniquely delivers the ability to perform data:
* profiling and classification
* searching and risk-scoring
* integration and federation
* migration and replication
* cleansing and enrichment
* validation and unification
* masking and encryption
* reporting and wrangling
* subsetting and testing
Voracity runs on-premise or in the cloud, on physical or virtual machines, and its runtimes can also be containerized or called from real-time applications or batch jobs. -
5
IRI Fast Extract (FACT)
IRI, The CoSort Company
A fast extract step can be a critical component of: database archive and replication; database reorgs and migrations; data warehouse ETL, ELT, and ODS operations; and offline reporting and bulk data protection. IRI Fast Extract (FACT™) is a parallel unload utility for very large database (VLDB) tables in Oracle, DB2 UDB, MS SQL Server, Sybase, MySQL, Greenplum, Teradata, Altibase, and Tibero. FACT uses simple job scripts (supported in a familiar Eclipse GUI) to rapidly create portable flat files. FACT's speed comes from native connection protocols and proprietary split-query logic that unloads billions of rows in minutes. Although FACT is a standalone, application-independent utility, it can also work nicely with other programs and platforms. For example, FACT optionally creates metadata for data definition files (.DDF) that IRI CoSort and its compatible data management and protection tools can use to manipulate the flat files. FACT also automatically creates database load utility configuration files for the same source. FACT is also an optional, seamlessly integrated component in the IRI Voracity ETL and data management platform. The automatic metadata creation -- and coexistence of other IRI software in the same IDE -- -
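FACT's split-query logic is proprietary, but the underlying idea of unloading a large table in parallel key ranges can be sketched generically. The Python sketch below is not FACT itself: it splits a table on a numeric key and unloads each range to its own CSV file on a separate thread; the table, column, and database are hypothetical demo objects, and any DB-API driver could stand in for sqlite3.

```python
import csv
import os
import sqlite3
import tempfile
from concurrent.futures import ThreadPoolExecutor

DB_PATH = os.path.join(tempfile.mkdtemp(), "demo.db")  # hypothetical source database
TABLE, KEY = "orders", "order_id"                       # hypothetical table and split key

# Build a small demo table so the sketch is runnable end to end.
with sqlite3.connect(DB_PATH) as conn:
    conn.execute(f"CREATE TABLE {TABLE} ({KEY} INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany(f"INSERT INTO {TABLE} VALUES (?, ?)",
                     [(i, i * 1.5) for i in range(1, 10001)])

def unload_range(part: int, lo: int, hi: int) -> str:
    """Unload one key range to its own CSV file, on its own connection."""
    out = f"{TABLE}_part{part}.csv"
    with sqlite3.connect(DB_PATH) as conn, open(out, "w", newline="") as fh:
        writer = csv.writer(fh)
        cur = conn.execute(
            f"SELECT * FROM {TABLE} WHERE {KEY} >= ? AND {KEY} < ?", (lo, hi))
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)
    return out

# Split the key space into 4 ranges and unload them concurrently.
with sqlite3.connect(DB_PATH) as conn:
    lo, hi = conn.execute(f"SELECT MIN({KEY}), MAX({KEY}) FROM {TABLE}").fetchone()
step = (hi - lo + 1) // 4 + 1
ranges = [(lo + i * step, lo + (i + 1) * step) for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    files = list(pool.map(lambda args: unload_range(*args),
                          [(i, a, b) for i, (a, b) in enumerate(ranges)]))
print("unloaded:", files)
```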
6
IRI NextForm
IRI, The CoSort Company
USD 3000. IRI NextForm is powerful, user-friendly Windows and Unix data migration software for data, file, and database profiling, conversion, replication, restructuring, federation, and reporting. NextForm inherits many of the SortCL program functions available in IRI CoSort and uses the IRI Workbench GUI, built on Eclipse™. The same high-performance data movement engine that maps between multiple sources and targets also makes NextForm a compelling, and affordable, place to begin managing big data without the need for Hadoop. -
7
Sesame Software
Sesame Software
When you have the expertise of an enterprise partner combined with a scalable, easy-to-use data management suite, you can take back control of your data, access it from anywhere, ensure security and compliance, and unlock its power to grow your business. Why use Sesame Software? Relational Junction builds, populates, and incrementally refreshes your data automatically. Enhance data quality: convert data from multiple sources into a consistent format, leading to more accurate data that provides the basis for solid decisions. Gain insights: by automating the update of information into a central location, you can use your in-house BI tools to build useful reports and avoid costly mistakes. Fixed price: avoid high consumption costs with yearly fixed prices and multi-year discounts, no matter your data volume. -
8
Stelo
Stelo
$30,000 annual. Stelo is an enterprise-class tool that dynamically delivers data from anywhere to anywhere. It can be used for analysis, prediction, and reporting, as well as for managing business operations and supply chains. You can easily move data between your core relational databases or delta lakes, across firewalls, to other people, or to the cloud. Stelo Data Replicator offers reliable, fast, and affordable replication for any relational or non-relational database via ODBC, Kafka, Delta Lakes, and flat-file formats. Stelo uses native data loading functions and multithreaded processing for fast, reliable performance when replicating multiple tables simultaneously. Easy installation using GUI interfaces, configuration wizards, and advanced tools makes product setup and operation simple, without the need for programming. Stelo runs in the background and requires no engineering support. -
9
StorCentric Data Mobility Suite
StorCentric
StorCentric Data Mobility Suite (DMS) is an all-inclusive solution that allows organizations to seamlessly move data wherever it is needed. DMS is a cloud-enabled software solution that supports data migration, replication, and data sync across mixed environments, including disk, tape, and cloud, allowing organizations to maximize ROI and eliminate data silos. DMS is vendor-agnostic and supports file replication and synchronization. It can be easily deployed and managed on a non-proprietary server. DMS can transfer millions of files simultaneously, and it protects data in transit to and from the cloud with SSL encryption. DMS simplifies point-to-point data movements and addresses data flow requirements from one storage platform to the next. Fine-grained filtering and continuous incremental updates make it easier to consolidate and move data across heterogeneous environments. DMS allows files to be synchronized across multiple storage repositories, including disk and tape. -
10
Qlik Replicate
Qlik
Qlik Replicate offers high-performance data replication, optimized data ingestion, and seamless integration with all major big data analytics platforms. Replicate supports both bulk replication and real-time incremental replication using CDC (change data capture). Its unique zero-footprint architecture eliminates unnecessary overhead on your mission-critical systems and facilitates data migrations and upgrades with zero downtime. Database replication allows you to consolidate or move data from a production database to a newer database version, another computing environment, or a different database management system; for example, you can migrate data from SQL Server to Oracle. Data replication is a way to move production data out of a database and into operational data stores, data warehouses, or other data storage systems for analytics or reporting. -
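Replicate's log-based CDC reads each database's transaction log directly, and that log interface is vendor-specific. The concept of capturing and applying only changed rows can still be shown with a simple trigger-based stand-in. The Python/SQLite sketch below is not Qlik code: it records inserts into a change table, then applies only those deltas to a copy of the table from a saved checkpoint; the table and column names are hypothetical.

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
-- Change table plus trigger: a simple stand-in for reading the transaction log.
CREATE TABLE customers_changes (change_id INTEGER PRIMARY KEY AUTOINCREMENT,
                                op TEXT, id INTEGER, name TEXT);
CREATE TRIGGER customers_ins AFTER INSERT ON customers
BEGIN
    INSERT INTO customers_changes (op, id, name) VALUES ('I', NEW.id, NEW.name);
END;
""")

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# Normal application writes on the source.
src.executemany("INSERT INTO customers (id, name) VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace")])

def apply_changes(last_applied: int) -> int:
    """Read only the changes captured since the checkpoint and apply them to the target."""
    rows = src.execute(
        "SELECT change_id, op, id, name FROM customers_changes WHERE change_id > ? "
        "ORDER BY change_id", (last_applied,)).fetchall()
    for change_id, op, id_, name in rows:
        if op == "I":
            tgt.execute("INSERT INTO customers (id, name) VALUES (?, ?)", (id_, name))
        last_applied = change_id
    tgt.commit()
    return last_applied

checkpoint = apply_changes(0)
print(tgt.execute("SELECT * FROM customers").fetchall())  # [(1, 'Ada'), (2, 'Grace')]
```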
11
QuerySurge
RTTS
7 Ratings. QuerySurge is the smart data testing solution that automates the data validation and ETL testing of big data, data warehouses, business intelligence reports, and enterprise applications, with full DevOps functionality for continuous testing.
Use Cases:
- Data Warehouse & ETL Testing
- Big Data (Hadoop & NoSQL) Testing
- DevOps for Data / Continuous Testing
- Data Migration Testing
- BI Report Testing
- Enterprise Application/ERP Testing
Features:
- Supported Technologies - 200+ data stores are supported
- QuerySurge Projects - multi-project support
- Data Analytics Dashboard - provides insight into your data
- Query Wizard - no programming required
- Design Library - take total control of your custom test design
- BI Tester - automated business report testing
- Scheduling - run now, periodically, or at a set time
- Run Dashboard - analyze test runs in real-time
- Reports - 100s of reports
- API - full RESTful API
- DevOps for Data - integrates into your CI/CD pipeline
- Test Management Integration
QuerySurge will help you:
- Continuously detect data issues in the delivery pipeline
- Dramatically increase data validation coverage
- Leverage analytics to optimize your critical data
- Improve your data quality at speed -
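QuerySurge builds this kind of comparison through its Query Wizard and dashboards. The generic technique underneath, comparing a source query's results against a target query's results, can be sketched in a few lines. The Python example below is not QuerySurge code; it compares row counts and row-level checksums between two hypothetical tables.

```python
import hashlib
import sqlite3

def row_fingerprints(conn, query: str) -> dict:
    """Map each row's key (first column) to a checksum of the whole row."""
    fps = {}
    for row in conn.execute(query):
        digest = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        fps[row[0]] = digest
    return fps

# Hypothetical source and target; in practice these would be two different databases.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
source.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                   [(1, "EMEA", 100.0), (2, "APAC", 250.0)])
target.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                   [(1, "EMEA", 100.0), (2, "APAC", 999.0)])   # deliberately wrong row

src_fps = row_fingerprints(source, "SELECT id, region, amount FROM sales ORDER BY id")
tgt_fps = row_fingerprints(target, "SELECT id, region, amount FROM sales ORDER BY id")

missing = src_fps.keys() - tgt_fps.keys()
mismatched = [k for k in src_fps.keys() & tgt_fps.keys() if src_fps[k] != tgt_fps[k]]
print(f"row counts: source={len(src_fps)} target={len(tgt_fps)}")
print("missing in target:", sorted(missing))
print("mismatched rows:", sorted(mismatched))   # -> [2]
```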
12
GS RichCopy 360 Enterprise
GuruSquad
$129 one-time payment. 126 Ratings. GS RichCopy 360 is enterprise-grade data migration software that copies your data (files and folders) to another location. Multi-threading technology allows files to be copied simultaneously. It offers the following premium features:
- Copy to Office 365 OneDrive or SharePoint
- Copy open files
- Copy NTFS permissions
- Support for long path names
- Run as a service and on a schedule (no need to stay logged in)
- Copy folder and file attributes as well as time stamps
- Send an email when the job is complete
- Support by phone and email
- Simple to use
- Copy data across the internet using a single TCP port, encrypted in transit
- Byte-level replication (copy only the deltas in a file, not the entire file)
- Superior and robust performance
- Supports Windows 7 or later (Windows 8, Windows 8.1, Windows 10)
- Supports Windows Server 2008 R2 or later (Windows Server 2012 R2, 2016, and 2019) -
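RichCopy's multi-threaded engine is a Windows product; the basic idea of copying many files concurrently is easy to illustrate in plain Python. The sketch below is not GuruSquad code: it copies every file under a source tree to a destination tree using a thread pool, and the directory names are hypothetical.

```python
import shutil
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor

SRC = Path("data_in")    # hypothetical source tree
DST = Path("data_out")   # hypothetical destination tree

# Create a tiny demo tree so the sketch runs as-is.
(SRC / "sub").mkdir(parents=True, exist_ok=True)
(SRC / "sub" / "example.txt").write_text("hello")

def copy_one(src_file: Path) -> str:
    """Copy a single file, preserving timestamps, and recreate its relative path."""
    rel = src_file.relative_to(SRC)
    dest = DST / rel
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src_file, dest)       # copy2 keeps file times and attributes
    return str(rel)

files = [p for p in SRC.rglob("*") if p.is_file()]
with ThreadPoolExecutor(max_workers=8) as pool:   # copy up to 8 files at once
    for rel in pool.map(copy_one, files):
        print("copied", rel)
```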
13
IBM® InfoSphere® Data Replication allows log-based data capture with transactional integrity to support big data integration, consolidation, warehousing, and analytics initiatives at scale. It allows you to replicate data between heterogeneous sources and targets and supports data upgrades and migrations with zero downtime. IBM InfoSphere Data Replication also provides continuous availability by maintaining replicas of databases in remote locations, allowing you to switch a workload to these replicas in seconds instead of hours. To get a first look at the new cloud-to-cloud and on-premises-to-cloud data replication capabilities, join the beta program. Find out what makes you a good candidate for the beta program and what you can expect. Register for the IBM Data Replication beta program to get limited access and work with us on the new product direction.
-
14
IRI RowGen
IRI, The CoSort Company
8000 on first hostname. IRI RowGen generates rows ... billions of rows of safe, intelligent test data in database, flat-file, and formatted report targets ... using metadata, not data. RowGen synthesizes and populates accurate, relational test data with the same characteristics as production data. RowGen uses the metadata you already have (or create on the fly) to randomly generate structurally and referentially correct test data, and/or randomly select data from real sets. RowGen lets you customize data formats, volumes, ranges, distributions, and other properties on the fly or with re-usable rules that support major goals like application testing and subsetting. RowGen uses the IRI CoSort engine to deliver the fastest generation, transformation, and bulk-load movement of big test data on the market. RowGen was designed by data modeling, integration, and processing experts to save time and energy in the creation of perfect, compliant test sets in production and custom formats. With RowGen, you can produce and provision safe, smart, synthetic test data for DevOps, DB, DV, and DW prototypes, demonstrations, application stress-testing, and benchmarking -- all without needing production data. -
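RowGen builds test data from your existing metadata; the core idea of synthesizing parent and child rows that remain referentially intact can be shown generically. The Python sketch below uses only the standard library and is not RowGen: it generates random customers, then orders whose foreign keys reference only the generated customer IDs; all names and ranges are made up for illustration.

```python
import random

random.seed(7)  # reproducible demo output

FIRST = ["Ana", "Ben", "Chen", "Dina", "Eli"]
LAST = ["Ortiz", "Kim", "Okafor", "Novak", "Sato"]

# Parent table: synthetic customers with plausible but entirely fake values.
customers = [
    {"customer_id": 1000 + i,
     "name": f"{random.choice(FIRST)} {random.choice(LAST)}",
     "credit_limit": random.choice([500, 1000, 2500, 5000])}
    for i in range(5)
]

# Child table: every order's foreign key is drawn from the generated parent keys,
# so referential integrity holds by construction.
customer_ids = [c["customer_id"] for c in customers]
orders = [
    {"order_id": i + 1,
     "customer_id": random.choice(customer_ids),
     "amount": round(random.uniform(10, 500), 2)}
    for i in range(12)
]

assert all(o["customer_id"] in customer_ids for o in orders)
for row in orders[:3]:
    print(row)
```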
15
IRI CoSort
IRI, The CoSort Company
From $4K USD perpetual use. For more than four decades, IRI CoSort has defined the state of the art in big data sorting and transformation technology. From advanced algorithms to automatic memory management, and from multi-core exploitation to I/O optimization, there is no more proven performer for production data processing than CoSort. CoSort was the first commercial sort package developed for open systems: CP/M in 1980, MS-DOS in 1982, Unix in 1985, and Windows in 1995. It has repeatedly been reported to be the fastest commercial-grade sort product for Unix, was judged by PC Week to be the "top performing" sort on Windows, and received a readership award from DM Review magazine in 2000. CoSort was first designed as a file sorting utility, and added interfaces to replace or convert sort program parameters used in IBM DataStage, Informatica, MF COBOL, JCL, NATURAL, SAS, and SyncSort. In 1992, CoSort added related manipulation functions through a control language interface based on VMS sort utility syntax, which evolved through the years to handle structured data integration and staging for flat files and RDBs, and spawned multiple spinoff products. -
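CoSort's engine is proprietary code tuned for memory and I/O; the classic technique it belongs to, external merge sorting of data too large for RAM, can be sketched in a few lines of Python. The example below is not CoSort: it sorts a text file by splitting it into sorted runs on disk and merging them with heapq.merge; the file names and run size are arbitrary.

```python
import heapq
import os
import tempfile

RUN_SIZE = 100_000  # max lines held in memory at once (arbitrary for the demo)

def external_sort(in_path: str, out_path: str) -> None:
    """Sort a text file line-by-line using bounded memory (external merge sort)."""
    runs = []
    with open(in_path) as fh:
        while True:
            chunk = [line for _, line in zip(range(RUN_SIZE), fh)]
            if not chunk:
                break
            chunk.sort()
            tmp = tempfile.NamedTemporaryFile("w+", delete=False, suffix=".run")
            tmp.writelines(chunk)
            tmp.seek(0)
            runs.append(tmp)
    with open(out_path, "w") as out:
        out.writelines(heapq.merge(*runs))  # k-way merge of the sorted runs
    for run in runs:
        run.close()
        os.unlink(run.name)

# Tiny self-contained demo input.
with open("unsorted.txt", "w") as fh:
    fh.writelines(f"{n:05d}\n" for n in [42, 7, 99, 3, 55])
external_sort("unsorted.txt", "sorted.txt")
print(open("sorted.txt").read())
```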
16
Arcion
Arcion Labs
$2,894.76 per month. You can deploy production-ready change data capture pipelines for high-volume, real-time data replication without writing a single line of code. Supercharged change data capture: Arcion's distributed Change Data Capture (CDC) allows for automatic schema conversion, flexible deployment, end-to-end replication, and much more. Arcion's zero-data-loss architecture ensures end-to-end consistency with built-in checkpointing. You can forget about performance and scalability concerns with a distributed, highly parallel architecture that supports 10x faster data replication. Arcion Cloud is the only fully managed CDC offering, with autoscaling, high availability, a monitoring console, and more. Reduce downtime and simplify data pipeline architecture. -
17
Flatfile
Flatfile
Flatfile is an advanced data exchange platform that simplifies the process of importing, cleaning, transforming, and managing data for businesses. It provides a robust suite of APIs, allowing seamless integration into existing systems for efficient file-based data workflows. With an intuitive interface, the platform supports easy data management through features like search, sorting, and automated transformations. Built with strict compliance to SOC 2, HIPAA, and GDPR standards, Flatfile ensures data security and privacy while leveraging a scalable cloud infrastructure. By reducing manual effort and improving data quality, Flatfile accelerates data onboarding and supports businesses in achieving better operational efficiency. -
18
Precisely Connect
Precisely
Integrate legacy systems seamlessly into next-gen cloud or data platforms with one solution. Connect allows you to take control of your data, from mainframe to cloud. Integrate data via batch and real-time ingestion for advanced analytics, comprehensive machine learning, and seamless data migration. Connect draws on the decades of experience Precisely has gained as a leader in mainframe sorting and IBM i data availability and security, making the company a leader in the field of complex data access and integration. Access to all enterprise data is possible for critical business projects. Connect supports a wide range of targets and sources for all your ELT/CDC needs. -
19
ibi Data Migrator
Cloud Software Group
ibi Data Migrator is a comprehensive ETL (Extract, Transform, Load) tool designed to streamline data integration across diverse platforms, including on-premises and cloud environments. It allows for the automation of data mart and data warehouse creation. Source data can be accessed in a variety of formats and operating systems. The platform integrates data from multiple sources into a single target or multiple targets, and applies robust data cleansing rules to ensure data quality. Users can schedule data updates based on events or conditional dependencies, or trigger them at user-defined intervals. The system can load star schemas that have slowly changing dimensions. It also offers detailed transaction statistics and extensive logging to provide deeper insight into data operations. The data management console is a graphical user interface that allows the design, testing, and execution of data flows and processes. -
20
iCEDQ
Torana
iCEDQ is a DataOps platform for data monitoring and testing. iCEDQ is an agile rules engine that automates ETL testing, data migration testing, and big data testing. It increases productivity and reduces project timelines for testing data warehouses and ETL projects. Identify data problems in your data warehouse, big data, and data migration projects. The iCEDQ platform can transform your ETL or data warehouse testing landscape by automating it end to end, allowing the user to focus on analyzing and fixing the issues. The first edition of iCEDQ was designed to validate and test any volume of data with our in-memory engine. It can perform complex validation using SQL and Groovy, and is optimized for data warehouse testing. It scales based on the number of cores on a server and is 5X faster than the standard edition. -
21
DataLark
LeverX
$24,000/year. DataLark is an SAP-centric data management platform that helps enterprises migrate, maintain, and integrate business-critical data on-premise and in the cloud more quickly, securely, and cost-effectively using its extensible plugins and connectors. The DataLark platform works across a wide range of industries and types of enterprise data. Solutions: Data Management, ERP, Data Validation and Profiling, and Data Integration. -
22
AWS Database Migration Service allows you to migrate databases to AWS quickly and securely. The migration process does not affect the source database's functionality, which minimizes downtime for applications that depend on the database. The AWS Database Migration Service can migrate your data to and from most commonly used commercial and open-source database systems.
-
23
Equalum
Equalum
Equalum's continuous data integration and streaming platform is unique in that it natively supports real-time, batch, and ETL use cases under one platform. There is no coding required. You can move to real time with a fully orchestrated, drag-and-drop, no-code UI. You will experience rapid deployment, powerful transformations, and scalable streaming data pipelines in minutes. Multi-modal, robust, and scalable CDC enables real-time streaming and data replication. No matter the source, the CDC is tuned for best-in-class performance. Get the power of open-source big data frameworks without the hassle: Equalum leverages the scalability of open-source data frameworks like Apache Spark and Kafka in its platform engine to dramatically improve streaming and batch data processing performance. This best-in-class infrastructure allows organizations to increase data volumes, improve performance, and minimize system impact. -
24
Lyftrondata
Lyftrondata
Lyftrondata can help you build a governed data lake or data warehouse, or migrate from your old database to a modern cloud-based data warehouse. Lyftrondata makes it easy to create and manage all your data workloads from one platform, including automatically building your warehouse and pipelines. It's easy to share data via ANSI SQL and BI/ML tools and analyze it instantly. You can increase the productivity of your data professionals while reducing your time to value. All data sets can be defined, categorized, and found in one place. These data sets can be shared with experts without coding and used to drive data-driven insights. This data sharing capability is ideal for companies who want to store their data once and share it with others. You can define a dataset, apply SQL transformations, or simply migrate your SQL data processing logic into any cloud data warehouse. -
25
dataZap
ChainSys
Cloud-to-cloud and on-premises-to-cloud data cleansing, migration, integration, and reconciliation. Oracle Enterprise Applications, Cloud and On-Premises, can be securely accessed via OCI. One platform for data migrations and setups, integrations and reconciliations, as well as big data ingestion and archiving. Pre-built API templates for over 9,000 web services and APIs. A data quality engine pre-configured with business rules to clean, enrich, and correct data. Low-code/no-code, configurable, agile, and flexible. Cloud-based software that is ready to use immediately. A platform for migrating data from legacy applications and any of the above systems into Oracle Cloud Applications. It is a robust, scalable data migration platform with an intuitive interface, with more than 3,000 Smart Data Adapters covering Oracle Applications. -
26
Agile DevOps teams today need to move faster. BMC Compuware File-AID is a cross-platform data and file management solution that allows developers and QA staff to quickly access the data and files they need, rather than searching for them. Developers spend less time on data-related tasks and more time developing new functionality or managing production issues. You can make code changes with confidence by rightsizing your test data without worrying about unintended consequences. File-AID allows developers to:
- Effectively manage files across platforms
- Access all file types, regardless of format or record length, for application integration
- Compare data files and objects to simplify the validation of test results
- Easily reformat files by changing an existing file format instead of starting from scratch
- Extract and load related data from multiple files and databases
-
27
StarfishETL
StarfishETL
400/month. StarfishETL is a cloud iPaaS solution, which gives it the unique ability to connect virtually any solution to any other solution, as long as both applications have an API. This gives StarfishETL customers ultimate control over their data projects, with the ability to build more unique and scalable data connections. -
28
Datametica
Datametica
Datametica's suite of "birds" has unmatched capabilities that help eliminate business risk, time, frustration, anxiety, and cost from the entire process of migrating a data warehouse to the cloud. Datametica's automated product suite allows you to migrate existing data warehouses, data lakes, ETL, enterprise business intelligence, and other data to the cloud environment of your choice, and to design an end-to-end migration strategy that includes workload discovery, assessment, and planning. From the discovery and assessment of your data warehouse to the planning of the migration strategy, Eagle provides clarity on what needs to be migrated, in what order, how to streamline the process, and what the costs and timelines are. The integrated view of the workloads and planning minimizes migration risk without affecting the business. -
29
Qlik Gold Client
Qlik
Qlik Gold Client improves the efficiency, security, and cost-effectiveness of managing SAP test data. Qlik Gold Client eliminates development workarounds by allowing you to easily move configuration, master, or transactional data subsets into test environments. Rapidly create, copy, and synchronize transactional information from production to non-production targets. Non-production data can be identified, selected, and deleted. A simple interface allows you to manage complex and powerful data transformations. Automate data selection and allow hands-free refresh cycles of test data, reducing the time and effort required for test data management. Qlik Gold Client offers several options to protect PII in non-production environments through data masking. Data masking is a set of rules that "scrambles" your production data when it is replicated to non-production environments. -
30
In a developer-friendly visual editor, you can design, debug, run, and troubleshoot data jobflows and data transformations. You can orchestrate data tasks that require a specific sequence and organize multiple systems using the transparency of visual workflows. Easily deploy data workloads into an enterprise runtime environment, in the cloud or on-premise. Data can be made available to applications, people, and storage through a single platform. You can manage all your data workloads and related processes from one platform. No task is too difficult. CloverDX was built on years of experience in large enterprise projects. Its user-friendly, flexible open architecture allows you to package and hide complexity for developers. You can manage the entire lifecycle of a data pipeline: design, testing, deployment, and evolution. Our in-house customer success teams will help you get things done quickly.
-
31
MOSTLY AI
MOSTLY AI
We can no longer rely upon real-life conversations as physical customer interactions shift to digital. Customers communicate their intentions and share their needs through data, which becomes a key tool for understanding customers and testing our assumptions. Privacy regulations like GDPR and CCPA make deep understanding more difficult. This gap in customer understanding is bridged by the MOSTLY AI synthetic data platform. Businesses can benefit from a reliable, high-quality generator of synthetic data in many different applications. The story doesn't end there: MOSTLY AI's synthetic data platform is more versatile than any other synthetic data generator, which makes it an indispensable tool for software development and testing, from AI training, explainability, bias mitigation, and governance to realistic test data with subsetting and referential integrity. -
32
Informatica Test Data Management
Informatica
We can help you find, create, and subset test data; visualize test coverage; and protect data so that you can concentrate on development. Automate provisioning of synthetic, subsetted, or masked data for development and testing purposes. Quickly identify sensitive data locations and mask data consistently within and across databases. To improve testers' efficiency, store, augment, share, or reuse test datasets. To reduce infrastructure requirements and increase performance, provision smaller data sets. Our comprehensive collection of masking techniques can be used to protect data across applications. To ensure solution integrity and speed deployments, support for packaged applications is included. To align with data governance initiatives, engage risk, compliance, audit, and other teams. Test efficiency can be improved with reliable, trusted production data sets. Server and storage footprints can be reduced with data sets that are targeted for each team. -
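Informatica automates subsetting through its own tooling; the underlying idea, carving out a small but referentially intact slice of production-like data, is straightforward to sketch. The Python/SQLite example below is not Informatica code: it selects a subset of parent rows and then pulls only the child rows that reference them; the schema and subset rule are hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(id),
                     amount REAL);
""")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(i, "EMEA" if i % 2 else "APAC") for i in range(1, 101)])
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(i, (i % 100) + 1, i * 2.0) for i in range(1, 501)])

# Subset rule: take only EMEA customers (the "driver" table of the subset) ...
subset_customers = db.execute(
    "SELECT * FROM customers WHERE region = 'EMEA'").fetchall()
ids = [row[0] for row in subset_customers]

# ... then take only the orders whose foreign keys point at that subset,
# so the extracted slice stays referentially intact.
placeholders = ",".join("?" * len(ids))
subset_orders = db.execute(
    f"SELECT * FROM orders WHERE customer_id IN ({placeholders})", ids).fetchall()

print(f"{len(subset_customers)} customers, {len(subset_orders)} orders in subset")
```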
33
GS RichCopy 360 Standard is enterprise-grade data migration software that copies your data (files and folders) to another location. Multi-threading technology allows files to be copied simultaneously. It offers the following premium features:
- Copy files to Office 365 OneDrive or SharePoint
- Copy open files
- Copy NTFS permissions
- Support for long path names
- Run as a service and on a schedule (no need to stay logged in)
- Copy folder and file attributes as well as time stamps
- Send an email when the copy is complete
- Support by phone and email
- Simple to use
-
34
PoINT Data Replicator
PoINT Software & Systems
Organizations store unstructured data in file systems, but are increasingly storing it in object and cloud storage. Cloud and object storage offer many advantages, especially for inactive data, which means that files must be migrated or replicated (e.g., from legacy NAS to cloud or object storage). As cloud and object storage become more popular, an underestimated security risk has emerged: most data stored in the cloud and in on-premises object storage is not backed up, as it is believed to remain secure. This assumption is dangerous and negligent. Cloud services and object storage products offer redundancy and high availability, but they do not protect against human error, ransomware, malware, or other technological failures. Cloud and object data therefore also need to be backed up or replicated, ideally to a separate storage technology at a different location, in the original format in which they are stored in object storage and cloud services. -
35
Alooma
Google
Alooma gives data teams visibility and control. It connects data from all your data silos into BigQuery in real time. You can set up and flow data in minutes, or customize, enrich, and transform data before it hits the data warehouse. Never lose an event: Alooma's safety nets make it easy to handle errors without affecting your pipeline. Alooma's infrastructure can handle any number of data sources, at low or high volume. -
36
Datagaps ETL Validator
Datagaps
DataOps ETL Validator is a comprehensive ETL testing automation and data validation tool. It is a comprehensive ETL/ELT validation tool that automates the testing of data migration projects and data warehouses with an easy-to-use, component-based user interface and low-code, zero-code test creation. ETL involves extracting data, transforming it according to operational needs, and then loading it into the target database or data store. ETL testing involves verifying the accuracy, integrity, and completeness of the data as it moves through the ETL process, to ensure that it meets business requirements and rules. Automation of ETL testing is possible with tools that automate data validation, comparison, and transformation tests; this reduces manual labor and significantly accelerates the testing cycle. ETL Validator automates ETL tests by providing intuitive interfaces to create test cases without extensive programming. -
37
Hevo Data is a no-code, bi-directional data pipeline platform specially built for modern ETL, ELT, and Reverse ETL Needs. It helps data teams streamline and automate org-wide data flows that result in a saving of ~10 hours of engineering time/week and 10x faster reporting, analytics, and decision making. The platform supports 100+ ready-to-use integrations across Databases, SaaS Applications, Cloud Storage, SDKs, and Streaming Services. Over 500 data-driven companies spread across 35+ countries trust Hevo for their data integration needs.
-
38
Keboola Connection
Keboola
Freemium. Keboola is an open-source, serverless integration hub for data, people, and AI models. We offer a cloud-based data integration platform designed to support all aspects of data extraction, cleaning, and enrichment. The platform is highly collaborative and solves many of the most difficult problems associated with IT-based solutions. The seamless UI makes it easy for even novice business users to go from data acquisition to building a Python model in minutes. You should try us! You will love it! -
39
Data Migration Manager (DMM) for OutSystems automates data and BPT migration, export, import, data deletion, and scramble/anonymization between all OutSystems environments (Cloud, On-prem, PaaS, Hybrid, MySQL, Oracle, SQL Server, Azure SQL, Java, or .NET) and versions (8, 9, 10, or 11). It is the only solution with a FREE download from OS Forge. Do you need to upgrade servers and migrate apps, and now must migrate the data and light BPT or BPT as well? Do you need to migrate data from the Qual to the Prod environment to populate lookup data? Do you need to move from Prod to Qual to replicate problems or get a good QA environment for testing? Do you need to back up data to be able to restore a demo environment later? Do you need to import data from other systems into OutSystems? Do you need to validate performance? What is Infosistema DMM? https://www.youtube.com/watch?v=strh2TLliNc Reduce costs, reduce risks, and improve time-to-market. DMM is the fastest way to solve a problem!
-
40
Database conversion and synchronization software: migrate your data fast and with confidence. Support for more than 10 database engines; cloud platforms including Amazon RDS, Microsoft Azure SQL, Google Cloud, and Heroku; more than 50 common migration directions; and more than 1 million database records can easily be transferred in less than 5 minutes.
-
41
Carbonite Migrate
Carbonite
Carbonite Migrate makes it easy to migrate your physical, virtual, and cloud workloads to any environment with minimal downtime and virtually no risk. Each stage of the process can be automated with finely tuned controls. To minimize downtime and facilitate fast cutover, you can replicate data continuously, and you can test the new environment as many times as you like without interrupting operations. Switching to the new server reduces downtime to minutes or seconds. After installing Carbonite Migrate, the administrator selects the source and destination servers in the console and chooses how to migrate data. There are many options available, from fully automated cloud orchestration workflows to DIY with our robust SDK, so the choice of hypervisor, cloud vendor, or hardware remains yours, with no lock-in. -
42
Xplenty
Xplenty Data Integration
Xplenty is scalable data delivery and integration software that allows large businesses and SMBs to prepare and transfer data to the cloud for analytics. Xplenty features include data transformations and a drag-and-drop interface, and it integrates with over 100 data stores and SaaS apps. Developers can easily add Xplenty to their data solution stack. Xplenty allows users to schedule jobs, track job progress, and keep track of job status. -
43
Impetus
Impetus
Due to multiple information sources operating in silos, the enterprise cannot find a single version of truth, and the confusion that results from hundreds of different solutions adds complexity. While we provide the best possible solutions and services to solve your data and AI problems, you can concentrate on your business. Out-of-the-box transformation accelerators are available for Teradata, Netezza, Ab Initio, Oracle, and other legacy data warehouses. View legacy code and evaluate the transformations to ETL, data warehouse, and analytics. Ingestion, CDC and streaming analytics, ETL and data prep, advanced analytics, and more. Build and deploy scalable data science and AI models across multiple platforms that leverage multiple data sources. Build a scalable, secure, fast, well-governed data lake that is agile and flexible. Use best practices and accelerators to speed cloud adoption, implementation, and ROI. -
44
Supermetrics
Supermetrics
$29 per monthSupermetrics began with a bold idea: to make marketing data simple and accessible for businesses everywhere. What started as a small project has grown into a pioneering marketing intelligence platform trusted by over 200K organizations worldwide, including renowned brands like Nestlé, Warner Bros, and Dyson. From the beginning, Supermetrics has been driven by a mission to empower marketers and data analysts with seamless data access and mastery, no matter where they are on their journey. The platform has evolved into an easy-to-use solution that extracts and consolidates data from over 150 marketing and sales platforms—like Google Analytics, Facebook Ads, and HubSpot—into preferred destinations, helping teams streamline their analytics and make data-driven decisions. This dedication to innovation earned Supermetrics a spot on G2’s 2024 Top 50 Best EMEA Software Companies list. At the heart of Supermetrics is a commitment to transparency, innovation, and customer success. We believe data has the power to tell stories, solve problems, and create opportunities. As the marketing landscape evolves, Supermetrics remains committed to leading the way, helping clients not only succeed but excel with cutting-edge solutions. -
45
AWS DataSync
Amazon
AWS DataSync is a secure online service that automates and accelerates the transfer of data between on-premises storage and AWS Storage services. It simplifies migration planning and reduces costly on-premises data movement with a fully managed service that scales seamlessly as data loads increase. DataSync can copy data between Network File System (NFS) shares, Server Message Block (SMB) shares, Hadoop Distributed File System (HDFS), self-managed object storage, AWS Snowcone, Amazon Simple Storage Service (Amazon S3) buckets, Amazon Elastic File System (Amazon EFS) file systems, Amazon FSx for Windows File Server, Amazon FSx for Lustre, Amazon FSx for OpenZFS, and Amazon FSx for NetApp ONTAP file systems. It can also move data between AWS Storage and other public clouds for replication, archiving, or sharing application data. DataSync offers end-to-end data security, including encryption and data integrity verification. -
46
Syniti Data Replication
Syniti
Syniti Data Replication (formerly DBMoto) software makes it simple to implement heterogeneous data replication and change data capture capabilities. An easy-to-use GUI and wizard-based screens allow you to deploy and run powerful data replication capabilities. There are no stored procedures to create, no proprietary syntax to learn, and no programming required on the source or target database platforms. -
47
GenRocket
GenRocket
Enterprise synthetic test data solutions. It is essential that test data accurately reflects the structure of your database or application, which means it must be easy for you to model and maintain each project. Respect the referential integrity of parent/child/sibling relations across data domains within an application database or across multiple databases used by multiple applications. Ensure consistency and integrity of synthetic attributes across applications, data sources, and targets; for example, a customer name must match the same customer ID across multiple transactions simulated by real-time synthetic data generation. Customers need to quickly and accurately build their data model for a test project, so GenRocket offers ten methods to set up your data model: XTS, DDL, Scratchpad, Presets, XSD, CSV, YAML, JSON, Spark Schema, and Salesforce. -
48
PeerDB
PeerDB
$250 per month. PeerDB is the fastest, easiest, and most cost-effective solution to replicate data between Postgres and data warehouses, queues, and storage. Designed to scale to any size and tailored for your data stores. PeerDB replays schema changes using replication messages from Postgres replication slots. Alerts for slot growth and connections. Native support for Postgres TOAST columns and large JSONB for IoT. Optimized query designs to reduce warehouse costs, especially useful for Snowflake or BigQuery. Support for partitioned tables via publications. Transaction snapshotting and CTID scanning provide a consistent and fast initial load. High availability with autoscaling, advanced logs, metrics, and monitoring dashboards. -
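PeerDB manages Postgres replication slots for you; for readers curious what reading a slot looks like at the lowest level, here is a minimal sketch using psycopg2's replication support (not PeerDB code). It assumes a Postgres server with wal_level=logical and uses the built-in test_decoding output plugin; the DSN and slot name are hypothetical.

```python
import psycopg2
import psycopg2.errors
import psycopg2.extras

DSN = "dbname=appdb user=replicator"   # hypothetical connection string
SLOT = "demo_slot"                      # hypothetical replication slot name

conn = psycopg2.connect(
    DSN, connection_factory=psycopg2.extras.LogicalReplicationConnection)
cur = conn.cursor()

# Create the slot once; it then retains WAL until changes are consumed and acknowledged.
try:
    cur.create_replication_slot(SLOT, output_plugin="test_decoding")
except psycopg2.errors.DuplicateObject:
    pass  # slot already exists from a previous run

cur.start_replication(slot_name=SLOT, decode=True)

def consume(msg):
    """Print each decoded change and acknowledge it so the slot does not grow unbounded."""
    print(msg.payload)
    msg.cursor.send_feedback(flush_lsn=msg.data_start)

cur.consume_stream(consume)   # blocks, streaming changes as they are committed
```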
49
Artie
Artie
$231 per month. Only send the data that has changed to the destination. Reduce computational overhead and eliminate data latency. Change data capture is a highly effective method of synchronizing data: reduce compute costs by streaming only the data that has changed. Log-based replication allows you to replicate data in real time without impacting source database performance. Install the end-to-end solution in just minutes. No pipeline maintenance is required, letting your data teams focus on more valuable projects. Artie can be set up in a few easy steps. Artie will backfill historical data and stream new changes continuously to the final tables as they occur. Artie ensures high reliability and data consistency. Artie uses offsets to continue where it left off in the event of a failure, which helps maintain high data integrity without the need for re-syncs. -
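Artie's offset handling is internal to its pipelines, but the general pattern, persisting the last processed position so a stream can resume after a failure without re-syncing, is simple to sketch. The Python example below is not Artie code; it checkpoints an offset to disk after each processed event, and the file name and event source are made up.

```python
import json
import os

CHECKPOINT = "offset.json"   # hypothetical checkpoint file

def load_offset() -> int:
    """Return the last acknowledged offset, or 0 if we have never run before."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as fh:
            return json.load(fh)["offset"]
    return 0

def save_offset(offset: int) -> None:
    with open(CHECKPOINT, "w") as fh:
        json.dump({"offset": offset}, fh)

# Stand-in for a change stream: each event carries a monotonically increasing offset.
events = [{"offset": i, "op": "update", "id": i % 3} for i in range(1, 11)]

start = load_offset()
for event in events:
    if event["offset"] <= start:
        continue                      # already applied before the crash/restart
    print("apply", event)             # in a real pipeline: write to the destination
    save_offset(event["offset"])      # acknowledge only after the write succeeds
```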
50
DataOps DataFlow
Datagaps
Contact us. DataOps DataFlow is a holistic, component-based platform built on Apache Spark to automate data reconciliation tests for modern data lake and cloud data migration projects. DataOps DataFlow provides a modern, web-based solution to automate the testing of ETL projects, data warehouses, and data migrations. Use DataFlow to load data from a variety of data sources, compare the data, and load differences into S3 or a database. Create and run dataflows quickly and easily. A top-of-the-class testing tool for big data testing, DataOps DataFlow integrates with all modern and advanced data sources, including RDBMS and NoSQL databases, cloud, and file-based sources.