Best Kyligence Alternatives in 2025
Find the top alternatives to Kyligence currently available. Compare ratings, reviews, pricing, and features of Kyligence alternatives in 2025. Slashdot lists the best Kyligence alternatives on the market: competing products that are similar to Kyligence. Sort through the alternatives below to make the best choice for your needs.
-
1
StarTree
StarTree
25 Ratings. StarTree Cloud is a fully managed real-time analytics platform designed for OLAP at massive speed and scale for user-facing applications. Powered by Apache Pinot, StarTree Cloud provides enterprise-grade reliability and advanced capabilities such as tiered storage, scalable upserts, and additional indexes and connectors. It integrates seamlessly with transactional databases and event streaming platforms, ingesting data at millions of events per second and indexing it for lightning-fast query responses. StarTree Cloud is available on your favorite public cloud or as a private SaaS deployment. StarTree Cloud includes StarTree Data Manager, which lets you ingest data from real-time sources such as Amazon Kinesis, Apache Kafka, Apache Pulsar, or Redpanda, as well as batch sources such as Snowflake, Delta Lake, Google BigQuery, Amazon S3, Apache Flink, Apache Hadoop, or Apache Spark. StarTree ThirdEye is an add-on anomaly detection system running on top of StarTree Cloud that observes your business-critical metrics, alerting you and allowing you to perform root-cause analysis, all in real time. -
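Because StarTree Cloud is powered by Apache Pinot, analytics queries are typically issued as SQL against a Pinot broker. Below is a minimal sketch using the open-source pinotdb Python driver; the host, port, and the "pageviews" table are hypothetical placeholders, not StarTree-specific values.

```python
# Minimal sketch: querying an Apache Pinot broker (the engine behind StarTree Cloud)
# with the open-source pinotdb driver. Host, port, and table name are placeholders.
from pinotdb import connect

conn = connect(host="broker.example.com", port=8099, path="/query/sql", scheme="http")
cur = conn.cursor()

# Aggregate events from a hypothetical "pageviews" table.
cur.execute("""
    SELECT country, COUNT(*) AS views
    FROM pageviews
    GROUP BY country
    ORDER BY views DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)
```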
2
Satori
Satori
86 Ratings. Satori is a Data Security Platform (DSP) that enables self-service data and analytics for data-driven companies. With Satori, users have a personal data portal where they can see all available datasets and gain immediate access to them. That means your data consumers get data access in seconds instead of weeks. Satori’s DSP dynamically applies the appropriate security and access policies, reducing manual data engineering work. Satori’s DSP manages access, permissions, security, and compliance policies - all from a single console. Satori continuously classifies sensitive data in all your data stores (databases, data lakes, and data warehouses), and dynamically tracks data usage while applying relevant security policies. Satori enables your data use to scale across the company while meeting all data security and compliance requirements. -
3
Looker
Google
20 Ratings. Looker reinvents the way business intelligence (BI) works by delivering an entirely new kind of data discovery solution that modernizes BI in three important ways. A simplified web-based stack leverages our 100% in-database architecture, so customers can operate on big data and find the last mile of value in the new era of fast analytic databases. An agile development environment enables today’s data rockstars to model the data and create end-user experiences that make sense for each specific business, transforming data on the way out, rather than on the way in. At the same time, a self-service data-discovery experience works the way the web works, empowering business users to drill into and explore very large datasets without ever leaving the browser. As a result, Looker customers enjoy the power of traditional BI at the speed of the web. -
4
Google Cloud Platform
Google
Free ($300 in free credits). 25 Ratings. Google Cloud is an online service that lets you build everything from simple websites to complex applications for businesses of any size. New customers receive $300 in credits for testing, deploying, and running workloads, and can use more than 25 products free of charge. Use Google's core data analytics and machine learning capabilities; the platform is secure, fully featured, and suitable for any enterprise. Use big data to build better products and find answers faster. You can grow from prototypes to production and even to planet scale without worrying about reliability, capacity, or performance. The offering ranges from virtual machines with proven price/performance advantages to a fully managed app development platform, plus high-performance, scalable, resilient object storage and databases. Google's private fibre network offers the latest software-defined networking solutions, along with fully managed data warehousing, data exploration, Hadoop/Spark, and messaging. -
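As one illustration of the managed data warehousing mentioned above, a BigQuery query can be run from Python with the google-cloud-bigquery client library. This is a minimal sketch that assumes default application credentials; the query uses a well-known public dataset.

```python
# Minimal sketch: running a SQL query against BigQuery, Google Cloud's managed
# data warehouse, using the google-cloud-bigquery client library.
from google.cloud import bigquery

client = bigquery.Client()  # picks up application default credentials

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

# The result() call waits for the job to finish and returns an iterator of rows.
for row in client.query(query).result():
    print(row["name"], row["total"])
```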
5
FlowWright
IPS
Business Process Management Software (BPMS & BPM Workflow Automation Software). Companies require support for workflow, forms, compliance, and automation routing. Low-code options make it easy to create and edit workflows, and our best-in-class forms capabilities make it easy to quickly build forms, logic, and workflows for forms-driven processes. Many systems are already in place and need to be integrated; our business process integrations between systems are loosely coupled and intelligently integrated. FlowWright gives you access to standard metrics as well as metrics you define when automating your business, and BPM analytics are an integral part of any BPM workflow management solution. FlowWright is available as a cloud solution or in an on-premise or .NET-hosted environment, including AWS and Azure. It was developed in C# on .NET, and all tools are browser-based and do not require plug-ins. -
6
Pentaho+ is an integrated suite of products that provides data integration, analytics, cataloging, and data optimization and quality improvement. This allows for seamless data management and drives innovation and informed decisions. Pentaho+ has helped customers achieve 3x improved data trust, 7x more impactful business results, and a 70% increase in productivity.
-
7
Unravel
Unravel Data
Unravel makes data work anywhere: on Azure, AWS, GCP, or in your own data center. With Unravel you can optimize performance, troubleshoot issues, and control costs. Unravel lets you monitor, manage, and improve your data pipelines on-premises and in the cloud, helping you drive better performance in the applications that support your business. Get a single view of your entire data stack. Unravel gathers performance data from every platform and system, then uses agentless technologies to model your data pipelines end to end. Analyze, correlate, and explore all of your cloud and modern data. Unravel's data models reveal dependencies, issues, and opportunities, as well as how apps and resources are being used and what's working. Rather than simply monitoring performance, you can quickly troubleshoot and resolve issues, and use AI-powered recommendations to automate performance improvements and lower costs. -
8
Striim
Striim
Data integration for hybrid clouds: modern, reliable data integration across both your private and public clouds, all in real time with change data capture and streams. Striim was developed by the executive and technical team behind GoldenGate Software, who have decades of experience with mission-critical enterprise workloads. Striim can be deployed as a distributed platform in your environment or in the cloud, and your team can easily adjust its scalability. Striim is fully secured, with HIPAA and GDPR compliance. Built from the ground up to support modern enterprise workloads, whether hosted in the cloud or on-premises. Drag and drop to create data flows between your sources and targets. Real-time SQL queries allow you to process, enrich, and analyze streaming data. -
9
AtScale
AtScale
AtScale accelerates and simplifies business intelligence, resulting in better business decisions and faster time to insight. Reduce repetitive data engineering tasks such as maintaining, curating, and delivering data for analysis. Define business definitions in one place to ensure consistent KPI reporting across BI tools. Speed up the time it takes to gain insight from data while managing cloud compute costs efficiently. No matter where your data is located, you can leverage existing data security policies when performing analytics. AtScale's Insights models and workbooks let you perform cloud OLAP multidimensional analysis on data sets from multiple providers, without any data prep or engineering. Easy-to-use dimensions and measures help you quickly gain insights that you can use to make business decisions. -
10
Privacera
Privacera
Multi-cloud data security with a single pane of glass: the industry's first SaaS access governance solution. The cloud is fragmented and data is scattered across different systems. Sensitive data is difficult to access and control due to limited visibility, complex data onboarding hinders data scientist productivity, data governance across services is manual and fragmented, and securely moving data to the cloud is time-consuming. Maximize visibility and assess the risk of sensitive data distributed across multiple cloud service providers. Manage data policies for multiple cloud services from a single place. Support RTBF, GDPR, and other compliance requests across multiple cloud service providers. Securely move data to the cloud and enable Apache Ranger compliance policies. One integrated system makes it easier and quicker to transform sensitive data across multiple cloud databases and analytical platforms. -
11
Elasticsearch
Elastic
1 Rating. Elastic is a search company. Elasticsearch, Kibana, Beats, and Logstash make up the Elastic Stack. These SaaS offerings allow data to be used in real time and at scale for search, logging, security, and analytics use cases. Elastic has over 100,000 community members in 45 countries, and its products have been downloaded more than 400 million times since their initial release. Today, thousands of organizations including Cisco, eBay, Dell, Goldman Sachs, Groupon, HP, Microsoft, Netflix, Uber, Verizon, and Yelp use the Elastic Stack and Elastic Cloud to power mission-critical systems that generate new revenue opportunities and huge cost savings. Elastic is headquartered in Amsterdam, The Netherlands and Mountain View, California, with more than 1,000 employees in over 35 countries. -
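For a sense of how the Elastic Stack is typically used, documents are indexed and searched through Elasticsearch's REST API. Here is a minimal sketch against a local, unsecured node; the "app-logs" index and its fields are hypothetical placeholders.

```python
# Minimal sketch: indexing and searching a document through Elasticsearch's
# REST API. Assumes an unsecured local node on port 9200; the "app-logs"
# index and its fields are hypothetical placeholders.
import requests

ES = "http://localhost:9200"

# Index a document; refresh=true makes it immediately visible to search.
requests.post(f"{ES}/app-logs/_doc?refresh=true", json={
    "service": "checkout",
    "level": "error",
    "message": "payment gateway timeout",
})

# Full-text search for documents whose message mentions "timeout".
resp = requests.post(f"{ES}/app-logs/_search", json={
    "query": {"match": {"message": "timeout"}}
})
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"])
```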
12
Azure Analysis Services
Microsoft
$0.81 per hour. 1 Rating. Azure Resource Manager allows you to quickly create and deploy Azure Analysis Services instances, and backup restore lets you quickly move your existing models to Azure Analysis Services. Take advantage of the flexibility, scale, and management benefits of the cloud: scale up, scale down, or pause the service, and pay only for what you use. Combine data from multiple sources into a trusted BI semantic model that is easy to understand and use. Simplifying the data view and its underlying structure enables business users to self-serve and discover data. Reduce the time it takes to gain insights from large and complex data sets, so your BI solution responds quickly to your business's needs and keeps up with your business. DirectQuery lets you connect to real-time operational data and monitor the pulse of your company. Visualize your data using your favorite data visualization tool. -
13
E-MapReduce
Alibaba
EMR is an enterprise-ready big data platform that offers cluster, job, and data management services. It is based on open-source ecosystems such as Hadoop, Spark, Kafka, and Flink. Alibaba Cloud Elastic MapReduce (EMR) is a big data processing solution that runs on the Alibaba Cloud platform. EMR is built on Alibaba Cloud ECS and is based on open-source Apache Spark and Apache Hadoop. EMR allows you to use Hadoop/Spark ecosystem components such as Apache Hive, Apache Kafka, Flink, and Druid to analyze and process data. EMR can process data stored in different Alibaba Cloud data storage services, such as Log Service (SLS), Object Storage Service (OSS), and Relational Database Service (RDS). It is easy to create clusters quickly without having to procure hardware or install software, and the web interface lets you perform all maintenance operations. -
14
Azure HDInsight
Microsoft
Run popular open-source frameworks--including Apache Hadoop, Spark, Hive, Kafka, and more--using Azure HDInsight, a customizable, enterprise-grade service for open-source analytics. Process huge amounts of data quickly and enjoy all the benefits of the broad open-source project community at the global scale of Azure. Easily migrate your big data workloads to the cloud. Open-source projects and clusters are easy to set up and manage. Big data clusters reduce costs through autoscaling and pricing tiers that let you pay only for what you use. Data protection is assured by enterprise-grade security and industry-leading compliance, with more than 30 certifications. Optimized components for open-source technologies such as Hadoop and Spark keep you up to date. -
15
PHEMI Health DataLab
PHEMI Systems
Unlike most data management systems, PHEMI Health DataLab is built with Privacy-by-Design principles, not as an add-on. This means privacy and data governance are built in from the ground up, providing you with distinct advantages: it lets analysts work with data without breaching privacy guidelines; includes a comprehensive, extensible library of de-identification algorithms to hide, mask, truncate, group, and anonymize data; creates dataset-specific or system-wide pseudonyms, enabling linking and sharing of data without risking data leakage; collects audit logs covering not only changes made to the PHEMI system but also data access patterns; and automatically generates human- and machine-readable de-identification reports to meet your enterprise governance, risk, and compliance guidelines. Rather than a policy per data access point, PHEMI gives you the advantage of one central policy for all access patterns, whether Spark, ODBC, REST, export, or more. -
16
Bizintel360
Bizdata
AI-powered self-service platform for advanced analytics. Connect data sources and create visualizations without programming. A cloud-native advanced analytics platform that delivers high-quality data supply and intelligent real-time analysis across the enterprise, with no programming required. Connect data sources in different formats, identify root causes, and reduce cycle time from source to destination, with analytics that require no programming knowledge and real-time data refresh on the move. Connect any data source, stream data in real time or at a defined frequency to the data lake, and visualize it in interactive, search-engine-based dashboards. With the power of the search engine and advanced visualization, you can perform predictive, prescriptive, and descriptive analytics from one platform, with no need for traditional technology to view data in different visualization formats. Bizintel360 visualization allows you to slice, dice, and combine data with different mathematical computations. -
17
Amazon EMR
Amazon
Amazon EMR is the market-leading cloud big data platform for processing large amounts of data with open-source tools such as Apache Spark, Apache Hive, and Apache HBase. EMR lets you run petabyte-scale analysis at a fraction of the cost of traditional on-premises solutions, and over 3x faster than standard Apache Spark. You can spin clusters up and down for short-running jobs and pay per second for the instances, or create highly available clusters that scale automatically to meet demand for long-running workloads. If you use on-premises open-source tools such as Apache Spark or Apache Hive, you can also run EMR clusters on AWS Outposts. -
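Spinning up a short-lived cluster as described above is typically done through the EMR API. Below is a minimal sketch with boto3; the release label, instance types and counts, and the log bucket are hypothetical placeholders, and the default EMR IAM roles are assumed to already exist in the account.

```python
# Minimal sketch: launching a transient EMR cluster with Spark and Hive via boto3.
# Release label, instance types/counts, and the log bucket are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="example-spark-cluster",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}, {"Name": "Hive"}],
    Instances={
        "InstanceGroups": [
            {"Name": "Primary", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate when submitted steps finish
    },
    LogUri="s3://example-bucket/emr-logs/",
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster ID:", response["JobFlowId"])
```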
18
Dremio
Dremio
Dremio provides lightning-fast queries and a self-service semantic layer directly on your data lake storage. No moving data to proprietary data warehouses, and no cubes, aggregation tables, or extracts. Data architects get flexibility and control, while data consumers get self-service. Apache Arrow and Dremio technologies such as Data Reflections, Columnar Cloud Cache (C3), and Predictive Pipelining combine to make querying your data lake storage fast and easy. An abstraction layer allows IT to apply security and business meaning, while letting analysts and data scientists explore data and create new virtual datasets. Dremio's semantic layer is an integrated, searchable catalog that indexes all your metadata so business users can make sense of your data. The semantic layer is made up of virtual datasets and spaces, all of which are searchable and indexed. -
19
HEAVY.AI
HEAVY.AI
HEAVY.AI is a pioneer in accelerated analytics. The HEAVY.AI platform can be used by government and business to uncover insights in data that is beyond the reach of traditional analytics tools. The platform harnesses the massive parallelism of modern CPU and GPU hardware and is available both in the cloud and on-premises. HEAVY.AI grew out of research at Harvard and the MIT Computer Science and Artificial Intelligence Laboratory. Go beyond traditional BI and GIS and extract high-quality information from large datasets with no lag by leveraging modern GPU and CPU hardware. Unify and explore large geospatial and time-series data sets to get a complete picture of what, when, and where. Combining interactive visual analytics, hardware-accelerated SQL, and advanced analytics and data science frameworks, you can find the opportunity and risk in your enterprise when it matters most. -
20
Keen
Keen.io
$149 per month. Keen is a fully managed event streaming platform. Our real-time data pipeline, built on Apache Kafka, makes it easy to collect large amounts of event data. Keen's powerful REST APIs and SDKs let you collect event data from any device connected to the internet. Our platform stores your data securely, reducing operational and delivery risk. Apache Cassandra's storage infrastructure keeps data secure: it is transferred via HTTPS and TLS and stored with multilayer AES encryption. Access Keys let you present data in an arbitrary way without having to re-architect the data model. Role-based Access Control allows completely customizable permission levels, down to specific queries or data points. -
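Event collection over Keen's REST API usually amounts to posting a JSON document to a project's event collection. This is a minimal sketch with the requests library; the project ID, write key, and "purchases" collection are hypothetical placeholders.

```python
# Minimal sketch: recording an event through Keen's REST API over HTTPS.
# The project ID, write key, and "purchases" collection are placeholders.
import requests

PROJECT_ID = "YOUR_PROJECT_ID"
WRITE_KEY = "YOUR_WRITE_KEY"

event = {
    "item": "golden widget",
    "price": 49.99,
    "user": {"id": "12345", "plan": "pro"},
}

resp = requests.post(
    f"https://api.keen.io/3.0/projects/{PROJECT_ID}/events/purchases",
    json=event,
    headers={"Authorization": WRITE_KEY},
)
resp.raise_for_status()
print(resp.json())  # a successful write is acknowledged in the JSON response
```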
21
Sumo Logic
Sumo Logic
$270.00 per month. 2 Ratings. Sumo Logic is a cloud-based solution for log management and monitoring for IT and security departments of all sizes. Integrated logs, metrics, and traces allow for faster troubleshooting: one platform, multiple uses. Increase your troubleshooting efficiency. Sumo Logic can help you reduce downtime, move from reactive to proactive monitoring, and use modern cloud-based analytics powered by machine learning to improve your troubleshooting. Sumo Logic Security Analytics allows you to quickly detect indicators of compromise, accelerate investigation, and ensure compliance. Sumo Logic's real-time analytics platform lets you make data-driven business decisions and predict and analyze customer behavior, while reducing the time it takes to investigate operational and security issues, so you have more time for other important activities. -
22
SigView
Sigmoid
Access granular data to make it easy to slice and dice billions of rows, with real-time reporting in just seconds. Sigmoid's SigView real-time data analytics tool is a plug-and-play solution for exploratory data analysis. Built on Apache Spark, SigView can drill down into large data sets in a matter of seconds, and around 30,000 people use it to analyze billions of ad impressions. SigView provides real-time access to both programmatic and non-programmatic data, creating real-time reports and analyzing large data sets to provide real-time insight. SigView is the platform to help you optimize your ad campaigns, discover new inventory, and generate revenue opportunities in changing times. It connects to multiple data sources such as DFP, pixel servers, and audience data, allowing you to ingest data in any format and from any location with a data latency of less than 15 minutes. -
23
Sesame Software
Sesame Software
When you have the expertise of an enterprise partner combined with a scalable, easy-to-use data management suite, you can take back control of your data, access it from anywhere, ensure security and compliance, and unlock its power to grow your business. Why use Sesame Software? Relational Junction builds, populates, and incrementally refreshes your data automatically. Enhance data quality: convert data from multiple sources into a consistent format, leading to more accurate data that provides the basis for solid decisions. Gain insights: by automating the update of information into a central location, you can use your in-house BI tools to build useful reports and avoid costly mistakes. Fixed price: avoid high consumption costs with yearly fixed pricing and multi-year discounts, no matter your data volume. -
24
Oracle Big Data Preparation
Oracle
Oracle Big Data Preparation Cloud Service is a cloud-based, managed Platform as a Service (PaaS) offering that allows you to quickly ingest, repair, and enrich large data sets in an interactive environment. For downstream analysis, you can integrate your data with other Oracle Cloud services, such as Oracle Business Intelligence Cloud Service. Oracle Big Data Preparation Cloud Service includes important features such as visualizations and profile metrics. Once a data set has been ingested, you get visual access to profile results and a summary for each column, as well as visual access to duplicate entity analysis results for the entire data set. You can view governance tasks on the service home page with easy-to-understand runtime metrics, data quality reports, and alerts, and track your transforms to ensure that files are being processed correctly. The entire data pipeline is visible, from ingestion through enrichment and publishing. -
25
5X
5X
$350 per month. 5X is a data platform that offers everything you need to centralize, clean, model, and analyze your data. 5X is designed to simplify data management. It offers seamless integration with more than 500 data sources, ensuring uninterrupted data movement between all your systems using pre-built and custom connectors. The platform covers ingestion, warehousing, modeling, orchestration, and business intelligence, all presented in an easy-to-use interface. 5X supports a variety of data movements from SaaS applications, databases, ERPs, and files, with data transferred automatically and securely to data lakes and warehouses. 5X's enterprise-grade security encrypts data at the source, identifying personally identifiable information and encrypting it at the column level. The platform is designed for a 30% reduction in total cost of ownership compared to building a platform yourself, and it enhances productivity by providing a single interface for building end-to-end pipelines. -
26
Vertica
OpenText
The Unified Analytics Warehouse. The Unified Analytics Warehouse delivers high-performance analytics and machine learning at large scale. Tech research analysts are seeing new leaders emerge as vendors strive to deliver game-changing big data analytics. Vertica empowers data-driven companies to make the most of their analytics initiatives, offering advanced time-series, geospatial, and machine learning capabilities, data lake integration, user-definable extensions, cloud-optimized architecture, and more. Vertica's Under the Hood webcast series lets you dive into the platform's features, delivered by Vertica engineers, technical experts, and others, and discover what makes it the most scalable advanced analytical database on the market. Vertica supports the most data-driven disruptors around the globe in their pursuit of industry and business transformation. -
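For reference, Vertica exposes a standard SQL interface. Below is a minimal connection-and-query sketch using the open-source vertica-python client; the host, credentials, and the "sales" table are hypothetical placeholders.

```python
# Minimal sketch: connecting to a Vertica database and running an analytical
# query with the open-source vertica-python client. Connection details and
# the "sales" table are hypothetical placeholders.
import vertica_python

conn_info = {
    "host": "vertica.example.com",
    "port": 5433,
    "user": "dbadmin",
    "password": "secret",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    cur.execute("""
        SELECT region, SUM(amount) AS revenue
        FROM sales
        GROUP BY region
        ORDER BY revenue DESC
    """)
    for region, revenue in cur.fetchall():
        print(region, revenue)
```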
27
Decision Moments
Mindtree
Mindtree Decision Moments is a data analytics platform that applies continuous learning algorithms to large data sets. Companies can gain valuable insights and improve their digital transformation using this innovative sense-and-respond system. Decision Moments is a flexible and customizable data intelligence platform that simplifies technological complexity, easily adapts to your organization's existing data analytics investments, and can be modified to meet changing market, technology, or business needs. Powered by Microsoft Azure services and the Cortana Intelligence Suite in a cloud-native platform, Decision Moments lets you reap the benefits and cost savings of data analytics platforms. Mindtree's Decision Moments gives your decision makers the platform they need to make sense of large amounts of data from multiple sources. -
28
QuerySurge
RTTS
8 Ratings. QuerySurge is the smart data testing solution that automates the data validation and ETL testing of big data, data warehouses, business intelligence reports, and enterprise applications, with full DevOps functionality for continuous testing.
Use Cases:
- Data Warehouse & ETL Testing
- Big Data (Hadoop & NoSQL) Testing
- DevOps for Data / Continuous Testing
- Data Migration Testing
- BI Report Testing
- Enterprise Application/ERP Testing
Features:
- Supported Technologies - 200+ data stores are supported
- QuerySurge Projects - multi-project support
- Data Analytics Dashboard - provides insight into your data
- Query Wizard - no programming required
- Design Library - take total control of your custom test design
- BI Tester - automated business report testing
- Scheduling - run now, periodically, or at a set time
- Run Dashboard - analyze test runs in real time
- Reports - 100s of reports
- API - full RESTful API
- DevOps for Data - integrates into your CI/CD pipeline
- Test Management Integration
QuerySurge will help you:
- Continuously detect data issues in the delivery pipeline
- Dramatically increase data validation coverage
- Leverage analytics to optimize your critical data
- Improve your data quality at speed -
29
CENX Service Assurance
Ericsson
CENX Service Assurance lets you see service topology, inventory, and faults from all your disparate systems correlated into a single view. With this insight, operators can optimize hybrid communication networks and achieve closed-loop automation. Operators can launch new business models, deliver next-generation services quickly, and support new technologies such as Network Functions Virtualization (NFV), Software-Defined Networking (SDN), 5G, and IoT cost-effectively. -
30
DataMax
Digiterre
DataMax is an enterprise-ready platform that takes the most complicated elements of real-time data management and makes them simple to develop, deploy, and operate at scale, enabling faster business change. DataMax is an innovative combination of architecture, process, and specific technologies that quickly moves an organisation from disparate information and reporting sources to a single view of data, providing the insight organisations need to run their businesses more effectively. This unique combination of technologies creates enterprise-level data management that is cloud-deployable and scalable, and covers both time-series and non-time-series data. The platform improves the quality and availability of data, analysis, and reporting for market analyst teams, which is then provided to traders. -
31
Enterprise Enabler
Stone Bond Technologies
It unifies information across silos and scattered data for visibility across multiple sources in a single environment. Whether your data is in the cloud, spread across siloed databases, on instruments, in big data stores, or within various spreadsheets and documents, Enterprise Enabler can integrate it all so you can make informed business decisions in real time. By creating logical views of data starting at the source, it allows you to reuse, configure, test, and deploy all your data in one integrated environment. Analyze your business data as it happens to maximize asset use, minimize costs, and improve and refine business processes. Our implementation time to market is 50-90% shorter. We connect your sources so that you can make business decisions based on real-time data. -
32
Arundo Enterprise
Arundo
Arundo Enterprise is a flexible, modular software suite that creates data products for people. We connect live data with machine learning and other analytical models, and model outputs are used to make business decisions. Arundo Edge Agent enables industrial connectivity and analytics in remote, rugged, and disconnected environments. Arundo Composer lets data analysts quickly and easily deploy desktop-based models into the Arundo Fabric cloud environment with a single command, and also lets companies create and manage data streams and integrate them with their deployed data models. Arundo Fabric is a cloud-based hub for deploying machine learning models, managing data streams and edge agents, and quickly navigating to extended applications. Arundo offers a range of high-ROI SaaS products, each with a core functional capability that leverages Arundo Enterprise's core strengths. -
33
VMware Tanzu Observability
Broadcom
Enterprise observability for all of your teams at scale. Traditional tools only detect simple threshold-based anomalies, making it difficult to distinguish between real issues and false alarms. VMware Tanzu Observability by Wavefront allows you to create smart alerts that dynamically filter out noise and capture true anomalies. Troubleshooting distributed cloud applications is difficult because of the many moving parts, dependencies on other applications, and frequent code changes. Wavefront tracks all metrics from your cloud applications, infrastructure, and clouds. It can be difficult to find the right needle when dealing with thousands of metrics from containerized microservices and distributed cloud applications. AI Genie™, which automatically identifies "unknown unknowns", allows you to quickly find the root cause of an incident and isolate it across applications, infrastructure, and cloud. -
34
Phocas Software
Phocas Software
Phocas provides an all-in-one business intelligence (BI) and financial planning and analysis (FP&A) platform for mid-market businesses that make, move, and sell. Driven by a mission to make people feel good about data, Phocas helps businesses connect, understand, and plan better together. Partnering with ERP systems like Epicor, Sage, and Oracle NetSuite, Phocas extends their capabilities by consolidating ERP, CRM, spreadsheets and other data sources into one easy-to-use platform, offering a range of tools to analyze, report, and plan. Its key features include intuitive dashboards, ad hoc reporting, dynamic financial statements, flexible budgeting, accurate forecasting, and automated rebate management. With real-time insights and secure access, Phocas empowers cross-functional teams to explore data and make informed decisions confidently. Designed to be self-serve for all business users, Phocas simplifies data-driven processes by automating manual tasks like consolidating financial and operational data – saving time and reducing errors. Whether you're preparing month-end reports, analyzing trends, managing cash flow, or optimizing rebates, Phocas provides the clarity you need to stay ahead. -
35
Apcela Arcus
Apcela
Apcela designs and manages cloud-optimized, software-defined enterprise networks. The Apcela Arcus Platform enables enterprises to plug their existing network into a software-defined WAN designed for today's multi-cloud environment. The Apcela Arcus Platform is a service that brings the speed and flexibility we love about cloud computing to the network that connects it. Apcela Arcus Connect, our core product, enables enterprises to extend their WAN across data centres, branch offices, and distributed users. This fully managed service offers an SD-WAN overlay connected to Apcela's global network, leveraging cloud and internet gateways from 60+ AppHubs around the world to improve performance. Apcela Arcus Connect starts with an SD-WAN overlay that allows enterprises to route traffic based on business needs, with internet traffic offloaded directly to the public internet. -
36
BigObject
BigObject
In-data computing is at the core of our innovation. It is a technology that allows us to process large quantities of data efficiently. BigObject, our flagship product, is a time-series database developed to handle massive data at high speed. Our core in-data computing technology enabled us to launch BigObject, which can handle non-stop data streams and all their aspects quickly and continuously. BigObject is an in-data database designed for high-speed data storage and analysis, with excellent performance and powerful query capabilities. It extends the relational data model to a time-series model structure and uses in-data computing to optimize database performance. Our core technology is a model abstraction in which all data is stored in an infinite memory space. -
37
AVEVA PI System
AVEVA
The PI System unlocks operational insight and new possibilities. The PI System enables digital transformation by providing trusted, high-quality operations information. Collect, enhance, or deliver data at any time, anywhere. Give operators and engineers the tools they need. Accelerate the work done by data scientists and analysts. Support new business opportunities. Real-time data collection from hundreds of assets, including legacy, remote, mobile, and IIoT. The PI System connects to your data regardless of where it is stored. You can store decades of data with subsecond resolution. You have immediate access to high-fidelity historical and real-time data. This allows you to keep your critical operations running smoothly and to gain business insights. Add intuitive metadata and labels to make data more meaningful. Create data hierarchies that are representative of your reporting and operating environments. Context is more than just a data point. It allows you to see the whole picture. -
38
Upsolver
Upsolver
Upsolver makes it easy to create a governed data lake and to manage, integrate, and prepare streaming data for analysis. Create pipelines using only SQL on auto-generated schema-on-read, with a visual IDE that makes building pipelines easy. Add upserts to data lake tables. Mix streaming and large-scale batch data, with automated schema evolution and reprocessing of previous state. Automated orchestration of pipelines (no DAGs). Fully managed execution at scale, with a strong consistency guarantee over object storage and nearly zero maintenance overhead for analytics-ready data. Built-in hygiene for data lake tables, including columnar formats, partitioning, compaction, and vacuuming. Low cost: 100,000 events per second (billions every day), with continuous lock-free compaction to eliminate the "small file" problem. Parquet-based tables are ideal for quick queries. -
39
3DS OUTSCALE
Dassault Systèmes
Outscale is a leading cloud provider offering scalable, performant, and secure private, public, and on-premise clouds, with stellar customer service and robust data centers all over the world. Founded in 2010 as a strategic partner of Dassault Systèmes, OUTSCALE has been certified CMSP Advanced (Cisco Systems), 100% ICT (Intel), and AltaVault (NetApp). The company offers enterprise-class cloud computing services that comply with local and regulatory requirements. OUTSCALE helps businesses increase their business agility and quickly deploy new business models that add value to their clients and internal customers. OUTSCALE simplifies infrastructure complexity and boosts the business agility of its customers. -
40
Lentiq
Lentiq
Lentiq is a data lake that allows small teams to do big tasks. Quickly run machine learning, data science, and data analysis at scale in any cloud. Lentiq allows your teams to ingest data instantly and then clean, process, and share it. Lentiq lets you create, train, and share models within your organization, so data teams can collaborate and innovate without restrictions. Data lakes are storage and processing environments that provide ML, ETL, and schema-on-read querying capabilities. If you are working on data science magic, a data lake is a must. The big, centralized data lake of the post-Hadoop era is gone. Lentiq uses data pools, which are interconnected, multi-cloud mini data lakes that work together to provide a stable, secure, and fast data science environment. -
41
doolytic
doolytic
Doolytic is a leader in big data discovery, the convergence of data discovery, advanced analytics, and big data. Doolytic brings together BI experts to revolutionize self-service exploration of big data and unleash the data scientist in everyone. doolytic is an enterprise solution for native big data discovery, built on best-of-breed, open-source, scalable technologies. Lightning performance on billions of records and petabytes of data. Structured, unstructured, and real-time data from any source. Advanced query capabilities for experts, and integration with R for advanced and predictive applications. With Elastic's flexibility, you can search, analyze, and visualize data in real time from any format or source. Harness the power of Hadoop data lakes without latency or concurrency issues. doolytic solves common BI issues and enables big data discovery without clumsy or inefficient workarounds. -
42
SHREWD Platform
Transforming Systems
Our SHREWD Platform tools and open APIs allow you to harness your entire system's data. The SHREWD Platform integrates and collects data from the SHREWD modules, combining it and storing it in our secure, UK-based data lake. The SHREWD modules and an API can access this data and transform it into meaningful information using targeted functions. The SHREWD Platform can access data in almost any format, from analog data in spreadsheets to digital systems via APIs, and open APIs allow third-party connections to the data lake. The SHREWD Platform is an operational data layer that provides a single source of truth in real time, allowing the SHREWD modules to provide intelligent insights and managers and key decision-makers to take appropriate action at the right moment. -
43
OctoData
SoyHuCe
OctoData can be deployed in cloud hosting at a lower cost and includes personalized support, from the initial definition of your needs to the actual use of the solution. OctoData is built on innovative open-source technologies that can adapt to new possibilities. Its Supervisor provides a management interface that allows users to quickly capture, store, and exploit increasing amounts and varieties of data. OctoData allows you to quickly prototype and industrialize massive data collection solutions, even in real time, in a single environment. You can get precise reports, explore new options, increase productivity, and improve profitability by leveraging your data. -
44
Teradata Vantage
Teradata
Businesses struggle to find answers as data volumes grow faster than ever. Teradata Vantage™ solves this problem. Vantage uses 100 percent of the available data to uncover real-time intelligence at scale - this is the new era of Pervasive Data Intelligence. All data across the organization is available in one place, accessible whenever you need it using your preferred languages and tools. Start small and elastically scale compute or storage in the areas that matter, thanks to the modern architecture. Vantage unifies analytics and data lakes in the cloud to enable business intelligence. Data is growing and business intelligence is becoming more important. Key issues that lead to frustration with existing data analysis platforms include a lack of the right tools and supportive environment required to achieve quality results, organizations not allowing or providing proper access to the tools users need, and the difficulty of preparing data. -
45
DataLux
Vivorbis
A data management and analytics platform that addresses data challenges and enables real-time decision-making. DataLux includes plug-and-play adaptors that allow the aggregation and visualization of large data sets. The data lake supports new innovations: you can store data and make it available for data modeling. Containerization can be used to create portable applications in a public, private, or on-premise cloud. Combine multiple time-series and inferred data sets, such as stock exchange tick data and market policy actions, as well as related and cross-industry data, to extract causal information about stock markets, macroeconomics, and other factors. By providing insights and guiding key decisions for product improvement, better business decisions can be made. Conduct interdisciplinary A/B tests across product design, engineering, and product development, from ideation to decision-making. -
46
IBM Db2 Big SQL
IBM
A hybrid SQL-on-Hadoop engine that delivers advanced, security-rich data queries across enterprise big data sources, including Hadoop, object storage, and data warehouses. IBM Db2 Big SQL is an enterprise-grade, hybrid, ANSI-compliant SQL-on-Hadoop engine that delivers massively parallel processing and advanced data querying. Db2 Big SQL allows you to connect to multiple sources, such as Hadoop HDFS and WebHDFS, RDBMS, NoSQL databases, and object stores. You benefit from low latency, high speed, data security, SQL compatibility, and federation capabilities for complex and ad hoc queries. Db2 Big SQL is available in two variants: integrated with Cloudera Data Platform, or as a cloud-native service on the IBM Cloud Pak® for Data platform. Access, analyze, and query real-time and batch data from multiple sources, including Hadoop, object stores, and data warehouses. -
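Because Db2 Big SQL speaks the Db2 client protocol, queries can typically be issued from Python with the ibm_db driver. This is a minimal sketch; the hostname, port, credentials, and the "sales" table are hypothetical placeholders.

```python
# Minimal sketch: running a query against Db2 Big SQL using the ibm_db driver.
# Hostname, port, credentials, and the table referenced are placeholders.
import ibm_db

conn_str = (
    "DATABASE=BIGSQL;"
    "HOSTNAME=bigsql.example.com;"
    "PORT=32051;"
    "PROTOCOL=TCPIP;"
    "UID=bigsql;"
    "PWD=secret;"
)

conn = ibm_db.connect(conn_str, "", "")
stmt = ibm_db.exec_immediate(
    conn, "SELECT product, SUM(qty) AS total FROM sales GROUP BY product"
)

# Iterate over result rows as dictionaries keyed by column name.
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["PRODUCT"], row["TOTAL"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```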
47
DataPlay
Margasoft
DataPlay is a cloud-based suite of software that automates data management and analysis. It can analyze SPSS data directly from Excel and PowerPoint, which allows researchers to reduce the amount of manual work involved in the analysis and report preparation. -
48
Google Cloud Trace
Google
Cloud Trace is a distributed tracing system that collects latency data from your applications and displays it in the Google Cloud Console. You can track how requests propagate through your application and get detailed, near real-time performance insights. Cloud Trace automatically analyzes your application's traces and generates detailed latency reports to surface performance degradations, and it can capture traces from all your VMs, containers, or App Engine projects. Cloud Trace lets you view aggregate latency data for your entire application or detailed latency information for a single request, and the available tools and filters help you quickly identify the source of bottlenecks. Cloud Trace is based on the tools Google uses to keep its own services running at extreme scale. -
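Traces usually reach Cloud Trace through instrumentation libraries such as OpenTelemetry. Below is a minimal sketch using the OpenTelemetry SDK with the Google Cloud Trace exporter; it assumes application default credentials and the opentelemetry-sdk and opentelemetry-exporter-gcp-trace packages, and the span names are arbitrary examples.

```python
# Minimal sketch: sending spans to Cloud Trace via OpenTelemetry.
# Assumes application default credentials and the opentelemetry-sdk /
# opentelemetry-exporter-gcp-trace packages are installed.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.cloud_trace import CloudTraceSpanExporter

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(CloudTraceSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

# Record a parent span with a nested child span; both appear as one trace
# in the Cloud Console once the batch processor flushes.
with tracer.start_as_current_span("handle-request"):
    with tracer.start_as_current_span("query-database"):
        pass  # placeholder for real work
```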
49
Oracle Big Data Service
Oracle
$0.1344 per hour. Customers can deploy Hadoop clusters of any size using Oracle Big Data Service, with VM shapes ranging from 1 OCPU up to a dedicated bare-metal environment. Customers can choose between high-performance and cost-effective block storage, and can grow or shrink their clusters. Create Hadoop-based data lakes quickly to expand or complement customer data warehouses, and ensure that all data can be accessed and managed efficiently. The included notebook supports R, Python, and SQL, so data scientists can query, visualize, and transform data to build machine learning models. Move customer-managed Hadoop clusters to a managed cloud-based service to improve resource utilization and reduce management costs. -
50
Tencent Cloud Elastic MapReduce
Tencent
EMR allows you to scale managed Hadoop clusters manually or automatically, according to your monitoring metrics or business curves. EMR's storage-computation separation allows you to terminate clusters to maximize resource efficiency. EMR supports hot failover on CBS-based nodes through a primary/secondary disaster recovery mechanism that allows the secondary node to start within seconds of the primary node failing, ensuring high availability of big data services. Remote disaster recovery is possible because of the metadata in components such as Hive, and high data persistence is achieved with computation-storage separation, using COS for data storage. EMR comes with a comprehensive monitoring system that lets you quickly locate and identify cluster exceptions to ensure stable cluster operations, and VPCs provide a convenient network isolation method for planning network policies for managed Hadoop clusters.