Best Xeotek Alternatives in 2024
Find the top alternatives to Xeotek currently available. Compare ratings, reviews, pricing, and features of Xeotek alternatives in 2024. Slashdot lists the best Xeotek alternatives on the market that offer competing products similar to Xeotek. Sort through the Xeotek alternatives below to make the best choice for your needs.
-
1
StarTree
StarTree
25 Ratings
StarTree Cloud is a fully managed real-time analytics platform designed for OLAP at massive speed and scale for user-facing applications. Powered by Apache Pinot, StarTree Cloud provides enterprise-grade reliability and advanced capabilities such as tiered storage, scalable upserts, and additional indexes and connectors. It integrates seamlessly with transactional databases and event streaming platforms, ingesting data at millions of events per second and indexing it for lightning-fast query responses. StarTree Cloud is available on your favorite public cloud or as a private SaaS deployment. StarTree Cloud includes StarTree Data Manager, which lets you ingest data from real-time sources such as Amazon Kinesis, Apache Kafka, Apache Pulsar, or Redpanda, as well as batch sources such as data warehouses (Snowflake, Delta Lake, or Google BigQuery), object stores such as Amazon S3, and frameworks such as Apache Flink, Apache Hadoop, or Apache Spark. StarTree ThirdEye is an add-on anomaly detection system running on top of StarTree Cloud that observes your business-critical metrics, alerting you and allowing you to perform root-cause analysis — all in real time. -
2
Streaming
Oracle
Streaming is a serverless, Apache Kafka-compatible service that lets developers and data scientists stream real-time events. It can be integrated with Oracle Cloud Infrastructure (OCI), Database, GoldenGate, and Integration Cloud, and it provides integrations with hundreds of third-party products spanning databases, big data, DevOps, and SaaS applications. Data engineers can easily create and manage big data pipelines, while Oracle handles all infrastructure and platform management, including provisioning, scaling, and security patching. With consumer groups, Streaming can provide state management for thousands of consumers, letting developers easily build applications at large scale. A Kafka-compatibility sketch follows.
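A minimal sketch of the Kafka compatibility described above, using the kafka-python library. The regional endpoint, tenancy/user/stream-pool credentials, and stream name are placeholder assumptions, not values from this listing.

```python
# Hypothetical sketch: producing to an OCI stream through its
# Kafka-compatible endpoint. All identifiers below are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="cell-1.streaming.us-ashburn-1.oci.oraclecloud.com:9092",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    # OCI expects "<tenancy>/<user>/<stream pool OCID>" as the SASL username
    # and an auth token as the password.
    sasl_plain_username="mytenancy/user@example.com/ocid1.streampool.oc1..example",
    sasl_plain_password="<OCI auth token>",
)
producer.send("my-stream", b'{"event": "signup", "user_id": 42}')
producer.flush()
```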
-
3
TreasuryPay
TreasuryPay
TreasuryPay Instant™, Enterprise Data and Intelligence. All transaction data is visible, as it happens, from anywhere in the world. With just one network connection, organizations gain worldwide accounting, liquidity management, marketing, and supply chain information, empowering them with enterprise intelligence. The TreasuryPay product set streams global receivables information and provides instant accountancy as well as cognitive services. It is simply the most advanced intelligence and insights platform available to global organizations. You can instantly provide enriched information to your entire global organization. Making the change is easy, and the return on investment is remarkable. With TreasuryPay Instant™, you can now access actionable intelligence and global accountancy in real time. -
4
Rockset
Rockset
Free
Real-time analytics on raw data. Live ingest from S3, DynamoDB, and more. Raw data can be accessed as SQL tables. In minutes, you can create amazing data-driven apps and live dashboards. Rockset is a serverless analytics and search engine that powers real-time applications and live dashboards. You can work directly with raw data such as JSON, XML, and CSV. Rockset can import data from real-time streams, data lakes, data warehouses, and databases without the need to build pipelines. Rockset syncs new data as it arrives in your data sources, with no fixed schema required. You can use familiar SQL, including filters, joins, and aggregations. Rockset automatically indexes every field in your data, making queries lightning fast. Fast queries power your apps, microservices, and live dashboards. Scale without worrying about servers, shards, or pagers. -
5
Materialize
Materialize
$0.98 per hour
Materialize is a reactive database that provides incremental view updates. Standard SQL lets developers easily work with streaming data. Materialize connects to many external data sources without any pre-processing: connect directly to streaming sources such as Kafka, Postgres databases, and CDC feeds, or to historical sources such as files or S3. Materialize lets you query, join, and transform data sources in standard SQL and presents the results as incrementally updated materialized views. Queries are kept current as new data streams in, so developers can easily build data visualizations or real-time applications. Building with streaming data is as easy as writing a few lines of SQL, as in the sketch below. -
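A minimal sketch of the "few lines of SQL" workflow. Materialize speaks the PostgreSQL wire protocol, so a standard driver such as psycopg2 works; the local port 6875, the materialize user, and the pre-existing "orders" source are assumptions for illustration.

```python
# Minimal sketch: define and query an incrementally maintained view in
# Materialize over the Postgres wire protocol. Assumes an "orders" source
# has already been created.
import psycopg2

conn = psycopg2.connect("postgresql://materialize@localhost:6875/materialize")
conn.autocommit = True
with conn.cursor() as cur:
    # The view is kept up to date incrementally as new events arrive.
    cur.execute("""
        CREATE MATERIALIZED VIEW order_totals AS
        SELECT customer_id, SUM(amount) AS total
        FROM orders
        GROUP BY customer_id
    """)
    cur.execute("SELECT * FROM order_totals WHERE total > 100")
    for row in cur.fetchall():
        print(row)
```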
6
Redpanda
Redpanda Data
Deliver customer experiences like never before with breakthrough data streaming capabilities. Compatible with the Kafka API and ecosystem.
- Predictable low latency with zero data loss.
- Up to 10x faster than Kafka.
- Enterprise-grade support and hotfixes.
- Automated backups to S3/GCS.
- 100% freedom from routine Kafka operations.
- Support for AWS/GCP.
Redpanda was built from the ground up to be easy to install and get running quickly. Once you have tried it in production, Redpanda's power will be evident, and you can move on to its more advanced functions. We manage all aspects of provisioning, monitoring, and upgrades, without access to your cloud credentials; sensitive data never leaves your environment. It can be provisioned, operated, maintained, and updated for you, with configurable instance types, and you can expand the cluster as your needs change. -
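A minimal sketch of the Kafka API compatibility: an unmodified Kafka client works against Redpanda. The broker address and topic name are assumptions for illustration.

```python
# Minimal sketch: produce to and consume from a Redpanda broker with an
# ordinary Kafka client. Assumes a broker on localhost:9092 and an
# "events" topic.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("events", b'{"page": "/checkout"}')
producer.flush()

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating when no new messages arrive
)
for message in consumer:
    print(message.value)
```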
7
Axual
Axual
Axual provides Kafka-as-a-Service for DevOps teams. Our intuitive Kafka platform empowers your team to unlock insights, drive decisions, and improve productivity. Axual is the ideal solution for enterprises that want to seamlessly integrate data streaming with their core IT infrastructure. Our all-in-one Kafka platform was designed to eliminate the need for extensive technical skills or knowledge and to provide a ready-made product that delivers all the benefits of event streaming without the hassle. The Axual Platform simplifies and enhances the deployment and management of Apache Kafka real-time data streaming, offering a wide range of features to meet the needs of modern enterprises and allowing organizations to maximize the potential of data streaming while minimizing complexity. -
8
Azure Data Explorer
Microsoft
$0.11 per hour
Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis of large volumes of data streaming from websites, applications, IoT devices, and more. Ask questions and iteratively analyze data on the fly to improve products and customer experiences, monitor devices, boost operations, and increase profits. Quickly identify patterns, anomalies, and trends in your data, and explore new topics to find answers easily. The optimized cost structure lets you run as many queries as you need, so you can explore new possibilities with your data efficiently. With this fully managed, easy-to-use service, you can focus on insights rather than infrastructure and respond to fast-changing, fast-flowing data. Azure Data Explorer simplifies analytics for all types of streaming data. -
9
Azure Event Hubs
Microsoft
$0.03 per hour
Event Hubs is a fully managed, real-time data ingestion service that is simple, reliable, and scalable. Stream millions of events per minute from any source to build dynamic data pipelines that respond to business challenges immediately. Use the geo-disaster recovery and geo-replication features to continue processing data during emergencies, and integrate seamlessly with other Azure services to unlock valuable insights. Existing Apache Kafka clients can talk to Event Hubs with no code changes, giving you a managed Kafka experience without running your own clusters (see the sketch below). You can combine real-time ingestion and microbatching on the same stream, and focus on gaining insights from your data instead of managing infrastructure. -
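A minimal sketch of the "no code changes" Kafka compatibility: point an existing Kafka client at the Event Hubs namespace over SASL/PLAIN. The namespace, event hub name, and connection string are placeholders.

```python
# Minimal sketch: an ordinary Kafka producer talking to Azure Event Hubs.
# Event Hubs accepts the literal username "$ConnectionString" with the
# namespace connection string as the password.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="mynamespace.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://mynamespace.servicebus.windows.net/;...",  # placeholder
)
producer.send("my-event-hub", b'{"sensor": "temp-1", "value": 21.7}')
producer.flush()
```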
10
KX Streaming Analytics
KX
KX Streaming Analytics lets you ingest, store, process, and analyze historical and time series data instantly to deliver analytics, insights, and visualizations. The platform offers the complete lifecycle of data services, including query processing, tiering, migration, archiving, and data protection, so your applications and users can be productive quickly. Our advanced analytics and visualization tools, widely used in finance and industry, let you create and execute queries, calculations, aggregations, and machine learning on any streaming or historical data. Data can be used across multiple hardware environments and can come from high-volume sources such as clickstreams, radio frequency identification, GPS systems, and social networking sites.
-
11
Esper Enterprise Edition
EsperTech Inc.
Esper Enterprise Edition is a distributed platform offering horizontal, linearly elastic scalability and fault-tolerant event processing. -
12
GigaSpaces
GigaSpaces
Smart DIH is a data management platform that quickly serves applications with accurate, fresh, and complete data, delivering high performance, ultra-low latency, and an always-on digital experience. Smart DIH decouples APIs from systems of record (SoRs) by replicating critical data and making it available through an event-driven architecture. It enables drastically shorter development cycles for new digital services and rapidly scales to serve millions of concurrent users, no matter which IT infrastructure or cloud topologies it relies on. XAP Skyline is a distributed in-memory development platform that delivers transactional consistency combined with extreme event-based processing and microsecond latency. The platform fuels core business solutions that rely on instantaneous data, including online trading, real-time risk management, and data processing for AI and large language models. -
13
Kapacitor
InfluxData
$0.002 per GB per hour
Kapacitor is the native data processing engine for InfluxDB 1.x and an integral component of the InfluxDB 2.0 platform. Kapacitor can process both batch and stream data from InfluxDB and act on that data in real time through its programming language, TICKscript. Modern applications need more than operator alerts and dashboards; they also need the ability to trigger actions. Kapacitor's alerting system follows a publish-subscribe design: alerts are published to topics, and handlers subscribe to those topics (see the sketch below). Kapacitor is very flexible and can be used to control your environment, performing tasks such as stock reordering and auto-scaling. It also has a simple plugin architecture (or interface) that allows it to integrate with any anomaly detection engine. -
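A hypothetical sketch of the publish-subscribe alerting just described: a stream task whose TICKscript publishes critical CPU alerts to a "cpu" topic that handlers can subscribe to. The host, database, measurement, and threshold are assumptions, and the request shape follows Kapacitor's HTTP task API.

```python
# Hypothetical sketch: register a stream task with Kapacitor's HTTP API.
# Placeholders throughout; adjust host, db/rp, and thresholds to taste.
import requests

task = {
    "id": "cpu_alert",
    "type": "stream",
    "dbrps": [{"db": "telegraf", "rp": "autogen"}],
    "script": (
        "stream\n"
        "    |from().measurement('cpu')\n"
        "    |alert()\n"
        "        .crit(lambda: \"usage_idle\" < 10)\n"
        "        .topic('cpu')\n"  # handlers subscribe to this topic
    ),
    "status": "enabled",
}
resp = requests.post("http://localhost:9092/kapacitor/v1/tasks", json=task)
resp.raise_for_status()
```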
14
Amazon MSK
Amazon
$0.0543 per hour
Amazon MSK is a fully managed service that makes it easy to build and run applications that use Apache Kafka to process streaming data. Apache Kafka is an open source platform for building real-time streaming data pipelines and applications. With Amazon MSK, you can use native Apache Kafka APIs to populate data lakes, stream changes to and from databases, and power machine learning and analytics applications. Apache Kafka clusters are difficult to set up, scale, and manage in production on your own; Amazon MSK handles that for you. -
15
Gretel
Gretel.ai
Privacy engineering tools delivered as APIs. In minutes, you can synthesize and transform data and earn the trust of your users and the community. Gretel's APIs let you instantly create anonymized or synthetic datasets so you can work with data safely while protecting privacy. To keep up with the pace of development, access to data must be faster. Gretel's data privacy tools bypass blockers and give machine learning and AI applications faster access to data. Gretel Cloud runners make it easy to scale your workloads up to the cloud, or you can keep data safe by running Gretel containers in your own environment. Developers will find it much easier to train models and create synthetic data using our cloud GPUs. Scale workloads instantly with no infrastructure required, invite colleagues to collaborate on cloud projects, and share data between teams. -
16
WarpStream
WarpStream
$2,987 per month
WarpStream is an Apache Kafka-compatible data streaming platform built directly on object storage: no inter-AZ networking costs, no disks to manage, and infinitely scalable within your VPC. WarpStream is deployed in your VPC as a stateless, auto-scaling binary agent. No local disks need to be managed; agents stream data directly to and from object storage with no buffering on local disks and no data tiering. Instantly create new "virtual" clusters in our control plane to support multiple environments, teams, or projects without managing any dedicated infrastructure. WarpStream is Apache Kafka protocol compatible, so you can keep using your favorite tools and applications with no rewrites or proprietary SDKs; simply change the URL in your favorite Kafka library to start streaming (see the sketch below). Never again choose between budget and reliability. -
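A minimal sketch of the "just change the URL" claim: an ordinary Kafka consumer pointed at a WarpStream agent. The agent hostname, group, and topic are assumptions for illustration.

```python
# Minimal sketch: consuming through a WarpStream agent with an unmodified
# Kafka client. Only the bootstrap address differs from a Kafka setup.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers="warpstream-agent.internal:9092",  # agent in your VPC
    group_id="analytics",
    auto_offset_reset="earliest",
)
for message in consumer:
    print(message.partition, message.offset, message.value)
```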
17
DeltaStream
DeltaStream
DeltaStream is a serverless streaming processing platform that integrates seamlessly with streaming storage services. Think of it as a compute layer on top of your streaming storage. It offers streaming databases and streaming analytics, along with other features, to provide an integrated platform for managing, processing, securing, and sharing streaming data. DeltaStream has a SQL-based interface that lets you easily create stream processing applications such as streaming pipelines, and it uses Apache Flink as a pluggable stream processing engine. DeltaStream is much more than a query-processing layer on top of Kafka or Kinesis: it brings relational database concepts to the world of data streaming, including namespacing and role-based access control, and enables you to securely access and process your streaming data regardless of where it is stored. -
18
Confluent
Confluent
With Confluent, Apache Kafka® has infinite retention. Be infrastructure-enabled, not infrastructure-restricted. Legacy technologies force you to choose between being real-time and being highly scalable; event streaming lets you innovate and win by being both. Ever wonder how your rideshare app analyzes massive amounts of data from multiple sources to calculate real-time ETAs? Or how your credit card company analyzes transactions from all over the world and sends fraud notifications in real time? Event streaming is the answer. Microservices are the future, and a persistent bridge to the cloud can enable your hybrid strategy. Break down silos to demonstrate compliance, gain real-time, persistent event transport, and much more. A sketch using Confluent's Python client follows. -
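A minimal sketch using Confluent's own Python client (confluent-kafka). The broker address, API credentials, and topic are placeholder assumptions.

```python
# Minimal sketch: produce an event with the confluent-kafka client.
# All credentials and addresses below are placeholders.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

def on_delivery(err, msg):
    # Called once per message to confirm delivery or surface an error.
    print("failed:" if err else "delivered to:", err or msg.topic())

producer.produce("transactions", key="card-123",
                 value=b'{"amount": 42.50}', callback=on_delivery)
producer.flush()
```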
19
Digital Twin Streaming Service
ScaleOut Software
ScaleOut Digital Twin Streaming Service™ lets you easily create and deploy real-time digital twins for streaming analytics. Connect to many data sources via Azure and AWS IoT hubs, Kafka, and more, and maximize situational awareness through live, aggregate analytics. This breakthrough cloud service simultaneously tracks telemetry from millions of data sources with "real-time digital twins," enabling deep, immediate introspection and state-tracking for thousands of devices. The powerful UI makes deployment easy and displays aggregate analytics in real time to maximize situational awareness. It is ideal for a wide variety of applications, including the Internet of Things, real-time intelligent monitoring, logistics, and financial services. Simple pricing makes it easy to get started. The ScaleOut Digital Twin Builder software and ScaleOut Digital Twin Streaming Service enable the next generation of stream processing. -
20
Oracle Stream Analytics
Oracle
Oracle Stream Analytics makes it possible to process and analyze large amounts of real-time data using complex correlation patterns, enrichment, and machine learning. It provides real-time, actionable business insight from streaming data and automates actions to drive today's agile businesses. -
21
Apache Flink
Apache Software Foundation
Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink runs in all common cluster environments and performs computations at in-memory speed and at any scale. Any type of data is produced as a stream of events: credit card transactions, machine logs, sensor measurements, and user interactions on a website or mobile app are all generated as streams. Apache Flink excels at processing both unbounded and bounded data sets. Thanks to its precise control of state and time, Flink's runtime can run any kind of application on unbounded streams. Bounded streams are processed internally by algorithms and data structures designed specifically for fixed-size data sets, yielding excellent performance. Flink integrates with all of the commonly used cluster resource managers. A PyFlink sketch follows. -
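A minimal sketch of Flink's unified model using the PyFlink DataStream API: a bounded in-memory collection here, but the same operators apply to unbounded streams arriving from Kafka or other connectors. The event names and job name are illustrative assumptions.

```python
# Minimal sketch: count events by type with the PyFlink DataStream API.
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# A bounded stream for demonstration; a Kafka source would be unbounded.
events = env.from_collection(["click", "view", "click", "purchase"])

events.map(lambda e: (e, 1)) \
      .key_by(lambda kv: kv[0]) \
      .reduce(lambda a, b: (a[0], a[1] + b[1])) \
      .print()

env.execute("event_counts")
```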
22
IBM StreamSets
IBM
$1000 per month
IBM® StreamSets lets users create and maintain smart streaming data pipelines through an intuitive graphical interface, facilitating seamless data integration across hybrid and multicloud environments. Leading global companies use IBM StreamSets to support millions of data pipelines for modern analytics and intelligent applications. Reduce data staleness and enable real-time information at scale, handling millions of records across thousands of pipelines in seconds. Drag-and-drop processors that automatically detect and adapt to data drift protect your data pipelines against unexpected changes and shifts. Create streaming pipelines to ingest structured, semi-structured, or unstructured data and deliver it to multiple destinations. -
23
Lenses
Lenses.io
$49 per month
Allow everyone to view and discover streaming data. Sharing, documenting, and cataloging data can increase productivity by up to 95%. Then build apps for production use cases on that data. Apply a data-centric security approach to address privacy concerns and cover the gaps in open source technology. Provide secure, low-code data pipeline capabilities. Eliminate all darkness and view data and apps with unparalleled visibility. Unify your data technologies and data meshes, and feel confident running open source in production. Independent third-party reviews have rated Lenses the best product for real-time stream analytics. Based on feedback from our community and thousands of engineering hours, we have built features that let you focus on what drives value from real-time data. You can deploy and run SQL-based real-time applications over any Kafka Connect or Kubernetes infrastructure, including AWS EKS. -
24
Apama
Apama
Apama Streaming Analytics enables organizations to analyze and respond to IoT and fast-moving data in real time, reacting intelligently to events as they occur. Apama Community Edition from Software AG is a freemium offering that lets users learn about, develop, and deploy streaming analytics applications. The Software AG Data & Analytics Platform offers an integrated, modular, end-to-end set of world-class capabilities optimized for high-speed data management, with connectivity and integration to all major enterprise data sources. Choose the capabilities you require: streaming, predictive, and visual analytics, plus messaging and integration with other enterprise applications. You can integrate historical and other data to create models and enrich customer data. -
25
Google Cloud Dataflow
Google
Unified stream and batch data processing that is serverless, fast, and cost-effective. A fully managed data processing service with automated provisioning and management of processing resources, horizontal autoscaling of worker resources to maximize utilization, and reliable, consistent exactly-once processing, built on the open source, community-driven Apache Beam SDK. Streaming data analytics at lightning speed: Dataflow enables faster, simpler streaming data pipeline development with lower data latency. Dataflow's serverless approach removes the operational overhead of data engineering workloads, letting teams concentrate on programming instead of managing server clusters, and it automates the provisioning, management, and utilization of processing resources to minimize latency. A minimal Beam pipeline sketch follows. -
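A minimal sketch of an Apache Beam pipeline, which is how Dataflow jobs are written. It runs locally by default; the commented flags show how the same code would target the Dataflow runner (the project, region, and bucket names are placeholder assumptions).

```python
# Minimal sketch: a word-count-style Beam pipeline. The same code runs
# locally or on Dataflow depending on the pipeline options.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # e.g. ["--runner=DataflowRunner",
                             #       "--project=my-project",
                             #       "--region=us-central1",
                             #       "--temp_location=gs://my-bucket/tmp"]

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.Create(["real-time click", "batch click", "view"])
        | "Split" >> beam.FlatMap(str.split)
        | "Count" >> beam.combiners.Count.PerElement()
        | "Print" >> beam.Map(print)
    )
```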
26
SQLstream
Guavus, a Thales company
In the field of IoT stream processing and analytics, SQLstream ranks #1 according to ABI Research. Used by Verizon, Walmart, Cisco, and Amazon, our technology powers applications on premises, in the cloud, and at the edge. SQLstream enables time-critical alerts, live dashboards, and real-time action with sub-millisecond latency. Smart cities can reroute ambulances and fire trucks or optimize traffic light timing based on real-time conditions. Security systems can detect hackers and fraudsters and shut them down right away. AI/ML models, trained with streaming sensor data, can predict equipment failures. Thanks to SQLstream's lightning performance, up to 13 million rows per second per CPU core, companies have drastically reduced their footprint and cost. Our efficient in-memory processing allows operations at the edge that would otherwise be impossible. Acquire, prepare, analyze, and act on data in any format from any source. Create pipelines in minutes, not months, with StreamLab, our interactive, low-code GUI development environment. Edit scripts and view results instantly without compiling. Deploy with native Kubernetes support. Easy installation covers Docker, AWS, Azure, Linux, VMware, and more. -
27
Amazon Kinesis
Amazon
Quickly collect, process, and analyze video and data streams. Amazon Kinesis makes it easy to collect, process, and analyze streaming data quickly. Amazon Kinesis offers key capabilities to cost-effectively process streaming data at any scale, along with the flexibility to choose the tools that best suit your application's requirements. With Amazon Kinesis, you can ingest real-time data such as video, audio, website clickstreams, application logs, and IoT data for machine learning, analytics, and other purposes. Amazon Kinesis lets you process and analyze data as it arrives rather than waiting for all the data to be collected before processing can begin, so you can ingest, buffer, and process streaming data and get insights in seconds or minutes instead of hours or days. A boto3 sketch follows. -
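A minimal sketch of writing to and reading from a Kinesis data stream with boto3. The stream name, region, shard ID, and payload are placeholder assumptions.

```python
# Minimal sketch: put a record onto a Kinesis stream, then read from one
# shard. Placeholders throughout.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Producer side: records with the same partition key land on the same shard.
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps({"user": 42, "page": "/checkout"}).encode(),
    PartitionKey="user-42",
)

# Consumer side: read records from the start of one shard.
shard_iter = kinesis.get_shard_iterator(
    StreamName="clickstream",
    ShardId="shardId-000000000000",
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]
for record in kinesis.get_records(ShardIterator=shard_iter, Limit=10)["Records"]:
    print(record["Data"])
```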
28
Solix Test Data Management
Solix Technologies
Accuracy in test data is key to quality application development and testing, which is why most application development teams insist that test data be populated frequently from production databases. A Test Data Management (TDM) program may keep six to eight full copies/clones of the production database for use as development and test instances. Provisioning test data without proper automation is inefficient, time-consuming, and storage-intensive, and it can expose sensitive data to unauthorized personnel, creating compliance risks. Cloning creates a data governance problem and a resource drain. Test and development databases may also not be refreshed often enough, leading to inaccurate test results and even failures, and the cost of application development rises as errors are discovered later in the development cycle. -
29
Apache Spark
Apache Software Foundation
Apache Spark™ is a unified analytics engine for large-scale data processing. Apache Spark delivers high performance for both streaming and batch data, using a state-of-the-art DAG scheduler, a query optimizer, and a physical execution engine. Spark offers over 80 high-level operators that make it easy to build parallel apps, and you can use it interactively from the Scala, Python, R, and SQL shells. Spark powers a stack of libraries, including SQL and DataFrames, MLlib for machine learning, GraphX, and Spark Streaming; these libraries can be combined seamlessly in one application. Spark runs on Hadoop, Apache Mesos, and Kubernetes, standalone, or in the cloud: run it in standalone cluster mode, on EC2, on Hadoop YARN, or on Mesos, and access a variety of data sources, including HDFS and Alluxio. A short PySpark sketch follows. -
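A minimal sketch mixing the DataFrame API with familiar SQL in one application. The inline data, view name, and column names are illustrative assumptions.

```python
# Minimal sketch: create a DataFrame, register it as a SQL view, and query it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example").getOrCreate()

df = spark.createDataFrame(
    [(1, "sensor-a", 21.5), (2, "sensor-b", 19.0), (3, "sensor-a", 22.1)],
    ["id", "device", "reading"],
)
df.createOrReplaceTempView("readings")

# DataFrame operators and SQL can be mixed freely.
spark.sql(
    "SELECT device, AVG(reading) AS avg_reading FROM readings GROUP BY device"
).show()

spark.stop()
```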
30
Delphix
Delphix
Delphix is the industry leader in DataOps, providing an intelligent data platform that accelerates digital transformation for leading companies around the world. The Delphix DataOps Platform supports many systems, including mainframes, Oracle databases, ERP applications, and Kubernetes containers. Delphix supports a wide range of data operations that enable modern CI/CD workflows, and it automates data compliance with privacy regulations such as GDPR, CCPA, and the New York Privacy Act. Delphix also helps companies sync data between private and public clouds, accelerating cloud migrations, customer experience transformations, and the adoption of disruptive AI technologies. -
31
KX Insights
KX
KX Insights is a cloud-native platform that delivers critical real-time performance and continuous, actionable intelligence. It enables rapid decision-making and automated responses to incidents by combining complex event processing, high-speed analytics, and machine learning interfaces. Cloud computing is not just about storage and compute elasticity; it encompasses everything: data and tools, development, security, connectivity, operations, and maintenance. KX can help you harness that power to make better, more informed decisions by integrating real-time analytics into your business operations. KX Insights uses industry standards to ensure interoperability and openness with other technologies, delivering insights faster and more cost-effectively. It uses microservices to capture, store, and process high-volume, high-velocity data using cloud protocols, services, and standards. -
32
Mockaroo
Mockaroo
$50 per year
It's difficult to create a meaningful UI prototype without making real API requests. Making real requests helps you identify issues with timing, application flow, and API design, improving both the user experience and the quality of the API. Mockaroo lets you create your own mock APIs, controlling the URLs, response times, and error conditions. Parallelize UI development and API development to deliver better applications sooner! There are many great data mocking libraries for every language and platform, but not everyone is a programmer, and not everyone has the time or desire to learn a new framework. Mockaroo makes it easy to download large quantities of randomly generated test data based on your specifications, which you can then load directly into your test environment in SQL or CSV format. A sketch against its generate API follows. -
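A hypothetical sketch of pulling generated rows from Mockaroo's generate API with the requests library. The API key is a placeholder, the field types shown are examples of Mockaroo's built-in types, and the exact request shape is an assumption based on its documented API.

```python
# Hypothetical sketch: request five rows of generated test data as JSON.
import requests

fields = [
    {"name": "id", "type": "Row Number"},
    {"name": "first_name", "type": "First Name"},
    {"name": "email", "type": "Email Address"},
]
resp = requests.post(
    "https://api.mockaroo.com/api/generate.json",
    params={"key": "YOUR_API_KEY", "count": 5},  # placeholder API key
    json=fields,
)
resp.raise_for_status()
for row in resp.json():
    print(row["id"], row["first_name"], row["email"])
```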
33
Organizations can support their business goals by managing data well over its life span. Decommissioned applications and historical transaction records can be archived while you retain access to the data for queries and reporting. You can scale data across applications, operating systems, and hardware platforms to improve security, speed up release cycles, and reduce costs. Data archiving is critical to the performance of mission-critical enterprise systems. Reduce data growth issues at the source, increase efficiency, and minimize the risks associated with managing structured data throughout its life. Protect unstructured data in testing, development, and analytics environments throughout the enterprise.
-
34
IBM Streams
IBM
1 Rating
IBM Streams analyzes a wide range of streaming data, including unstructured text, video and audio, and geospatial and sensor data, helping organizations spot opportunities and risks and make decisions in real time. -
35
Informatica Data Engineering Streaming
Informatica
AI-powered Informatica Data Engineering Streaming lets data engineers ingest and process real-time streaming data to gain actionable insights. -
36
Embiot
Telchemy
Embiot® is a compact, high-performance IoT analytics software agent for smart sensor and IoT gateway applications. This edge computing application can be embedded directly into devices, smart sensors, and gateways, yet is powerful enough to compute complex analytics over large amounts of raw data at high speed. Internally, Embiot uses a stream processing model to handle sensor data that arrives at different times and in a different order. With its intuitive configuration language, rich in math, statistics, and AI functions, it makes solving analytics problems quick and easy. Embiot supports many input formats, including MODBUS, MQTT, REST/XML, REST/JSON, name/value, and CSV, and can send output reports to multiple destinations simultaneously in REST, custom text, and MQTT formats. For security, Embiot supports TLS on select input streams as well as HTTP and MQTT authentication. -
37
SAS Event Stream Processing
SAS Institute
Streaming data from operations and transactions is valuable when it is well understood. SAS Event Stream Processing includes streaming data quality and analytics plus a vast array of SAS and open source machine learning and high-frequency analytics for connecting to, deciphering, and cleansing streaming data. No matter how fast your data moves or how many sources you pull from, all of it is under your control through a single, intuitive interface. You can define patterns and address situations from any aspect of your business, giving you the agility to deal with issues as they arise. -
38
Kinetica
Kinetica
A cloud database that can scale to handle massive streaming data sets. Kinetica harnesses modern vectorized processors to run orders of magnitude faster on real-time spatial and temporal workloads. Track and gain intelligence from billions upon billions of moving objects in real time. Vectorization unlocks new levels of performance for analytics on spatial and time series data at large scale. You can ingest and query simultaneously to act on real-time events: Kinetica's lockless architecture allows distributed ingestion, so data is available to query as soon as it arrives. Vectorized processing lets you do more with fewer resources; more power means simpler data structures, which can be stored more efficiently and let you spend less time engineering your data. Vectorized processing also enables incredibly fast analytics and detailed visualizations of moving objects at large scale. -
39
ERBuilder
Softbuilder
$49
ERBuilder Data Modeler is a GUI data modeling tool that lets developers visualize, design, and model databases using entity relationship diagrams, and it automatically generates schemas for the most common SQL databases. Share data model documentation with your team, and optimize your data models with advanced features such as schema comparison, schema synchronization, and test data generation. -
40
DTM Data Generator
DTM Data Generator
With over 70 functions and an expression processor, the product lets users quickly create test data with dependencies, internal structure, and relationships. The product automatically analyzes an existing database schema and resolves master-detail key structures (relationships). The Value Library provides predefined data sets including names, countries, cities, currencies, companies, sectors, departments, regions, and more. The Named Generators and Variables features let you share data generation properties across similar columns. An intelligent schema analyzer makes your data more realistic without additional project modifications, and the data-by-example feature makes data generation more realistic and less time-consuming. -
41
Alibaba Cloud DataHub
Alibaba Cloud
DataHub supports many SDKs and APIs and offers multiple third-party plug-ins such as Flume and Logstash, letting you import data quickly and efficiently. The DataConnector module syncs imported data in real time to downstream storage and analysis systems such as MaxCompute, OSS, and Tablestore. DataHub lets you import heterogeneous data from websites, applications, IoT devices, and databases in real time and manage it all in one place, then send the data on to downstream systems such as analysis or archiving systems, creating a data streaming pipeline that extracts more value from your data. -
42
GenRocket
GenRocket
Enterprise synthetic test data solutions. Test data must accurately reflect the structure of your database or application, which means it must be easy to model and maintain for each project. Respect the referential integrity of parent/child/sibling relationships across data domains within an application database, or across multiple databases used by multiple applications. Ensure the consistency and integrity of synthetic attributes across applications, data sources, and targets; for example, a customer name must match the same customer ID across multiple transactions simulated by real-time synthetic data generation. Customers need to build the data model for a test project quickly and accurately, so GenRocket offers ten methods for setting up your data model: XTS, DDL, Scratchpad, Presets, XSD, CSV, YAML, JSON, Spark Schema, and Salesforce. -
43
Solace PubSub+
Solace
Solace is a specialist in event-driven architecture (EDA), with two decades of experience providing enterprises with highly reliable, robust, and scalable data movement technology based on the publish-subscribe (pub/sub) pattern. Solace technology enables the real-time data flow behind many of the conveniences you take for granted every day: immediate loyalty rewards from your credit card, weather data delivered to your mobile phone, real-time airplane movements on the ground and in the air, and timely inventory updates to some of your favourite department stores and grocery chains. Solace technology also powers many of the world's leading stock exchanges and betting houses. Aside from rock-solid technology, stellar customer support is one of the biggest reasons customers select Solace and stick with them. -
44
Cloudera DataFlow
Cloudera
You can manage your data from the edge to the cloud with a simple, no-code approach to creating sophisticated streaming applications. -
45
Cumulocity IoT
Software AG
Cumulocity IoT, the #1 low-code, self-service IoT platform, comes pre-integrated with all the tools you need for fast results: device connectivity and management, application enablement, integration, and streaming and predictive analytics. Your business can no longer depend on proprietary technology: because the platform is completely open, you can connect any "thing" to it, bring your own hardware and tools, and choose the components that fit you best. Be up and running with IoT in minutes: connect a device, view its data, create a real-time interactive dashboard, and define rules to monitor and respond to events, all without involving IT or writing code. Just as easily, integrate new IoT data into the core enterprise systems, apps, and processes that have run your business for years, again without coding, for a fluid flow of data and more context for smarter decisions. -
46
Tonic
Tonic
Tonic automatically creates mock datasets that preserve key characteristics of secure data sets so that data scientists, developers, and salespeople can work efficiently without exposing real identities. Tonic creates safe, de-identified data from your production data and models it to tell a similar story in your testing environments: safe, useful data scaled to match your real-world data. Safely share data across businesses, teams, and borders with data that looks and behaves like your production data. Identify and obfuscate PII/PHI, and protect your sensitive data proactively with automatic scanning, alerts, and de-identification. Advanced subsetting across diverse database types, with fully automated collaboration, compliance, and data workflows. -
47
Datanamic Data Generator
Datanamic
€59 per month
Datanamic Data Generator lets developers quickly populate databases with thousands of rows of meaningful, syntactically correct data for database testing. A blank database is useless for testing your application; test data is essential, and writing your own test data generators or scripts is difficult. Datanamic Data Generator can help. The tool is aimed at developers, DBAs, and testers who need sample data to test a database-driven application. It makes generating database test data easy: it reads your database, displays tables and columns with their data generation settings, and needs only a few entries to generate complete, realistic test data. It can be used to create test data from scratch or from existing data. -
48
Informatica Test Data Management
Informatica
We can help you find, create, and subset test data; visualize test coverage; and protect data so you can concentrate on development. Automate the provisioning of synthetic, subsetted, and masked data for development and testing. Consistent masking within and across databases lets you quickly identify sensitive data locations. Store, augment, share, and reuse test datasets to improve testers' efficiency, and provision smaller data sets to reduce infrastructure requirements and increase performance. Our comprehensive collection of masking techniques protects data consistently across applications. Support for packaged applications ensures solution integrity and speeds deployments. Engage risk, compliance, audit, and other teams to align with data governance initiatives. Improve test efficiency with reliable, trusted production data sets, and reduce server and storage footprints with data sets targeted to each team. -
49
Qlik Gold Client
Qlik
Qlik Gold Client improves the efficiency and security of managing SAP test data while reducing its cost. Qlik Gold Client eliminates development workarounds by letting you easily move subsets of configuration, master, and transactional data into test environments. Rapidly create, copy, and synchronize transactional information from production to non-production targets, and identify, select, and delete non-production data. A simple interface lets you manage complex and powerful data transformations. Automate data selection and enable hands-free refresh cycles of test data, reducing the time and effort required for test data management. Qlik Gold Client offers several options to protect PII in non-production environments through data masking, a set of rules that "scrambles" production data as it is replicated to non-production environments. -
50
BMC Compuware Topaz for Enterprise Data
BMC Software
Visualize large numbers of data objects, understand their relationships, and tune related data extracts for optimal test data. Comparing files on different LPARs lets you quickly and easily assess the impact of your changes. Developers and testers can simplify the complex task of managing and preparing test data, performing data-related tasks without writing programs or scripts, coding SQL, or juggling multiple utilities. Developers, analysts, and test engineers become more self-sufficient when they can provision data as needed, decreasing reliance on subject matter experts. Better testing scenarios improve application quality by making it easier to create data extracts for testing and to accurately identify the effects of data changes.