data application

Results 351 - 375 of 1283
Published By: CA Technologies EMEA     Published Date: Aug 03, 2017
The GDPR is set to have wide-ranging implications for the types of data that can be used in non-production environments. Organizations will need to understand exactly what data they hold and who is using it, and must be able to restrict its use to tasks for which consent has been given.
Tags : 
organization scope, data controllers, technology, gdpr, application regulation, corporate rules, contractual clauses, anonymize data, privacy projects
    
CA Technologies EMEA
Published By: CA Technologies EMEA     Published Date: Aug 03, 2017
Using CA Live API Creator, you can execute business policies using Reactive Logic. You write simple declarative rules defining relationships across data fields, and they’re automatically enforced when changes occur—just like formulas in a spreadsheet. Reactive Logic should cover most of your application requirements, but you also have the ability to configure event processing or external callouts using server-side JavaScript or imported Java® libraries if you so desire.
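As a generic illustration of this reactive pattern (not CA Live API Creator's actual API; the class, fields and formulas below are invented for the example), derived fields can be declared as formulas and recomputed whenever the underlying data changes:

    # A generic sketch of reactive, spreadsheet-style rules; not CA's API.
    class ReactiveRecord:
        def __init__(self, **fields):
            self._fields = dict(fields)
            self._rules = {}  # derived field name -> formula over the record

        def rule(self, name, formula):
            # Declare a derived field, like a spreadsheet formula.
            self._rules[name] = formula
            self._recompute()

        def set(self, name, value):
            self._fields[name] = value
            self._recompute()  # every change automatically re-enforces the rules

        def _recompute(self):
            for name, formula in self._rules.items():
                self._fields[name] = formula(self._fields)

        def get(self, name):
            return self._fields[name]

    # Hypothetical usage: 'amount' tracks qty * unit_price as inputs change.
    order = ReactiveRecord(qty=2, unit_price=9.5)
    order.rule("amount", lambda f: f["qty"] * f["unit_price"])
    order.set("qty", 3)
    print(order.get("amount"))  # 28.5, recomputed like a spreadsheet cell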
Tags : 
api, application programming interface, psd2, open banking, json, github
    
CA Technologies EMEA
Published By: CA Technologies EMEA     Published Date: Aug 07, 2017
CA API Management is a robust, enterprise-grade solution that can enable the success of your API initiatives. The software provides industry-leading tools to rapidly create APIs from existing data assets, orchestrate legacy services and safely expose enterprise applications and services. The solution also allows you to quickly onboard, manage and enable the developers who will create innovative apps that add value to your business. And, just as importantly, CA API Management secures your enterprise data to meet the toughest compliance and regulatory standards, while providing you with full control over which apps, developers and partners can access your APIs.
Tags : 
api, application programming interface, psd2, open banking, json, github
    
CA Technologies EMEA
Published By: CA Technologies EMEA     Published Date: Aug 07, 2017
Generate rich virtual data that covers the full range of possible scenarios, and provide the unconstrained access to environments needed to deliver rigorously tested applications on time and within budget. Model complex live system data and apply automated rule-learning algorithms to pay off technical debt and build an in-depth understanding of composite applications, while exposing virtual data to distributed teams on demand and avoiding testing bottlenecks.
Tags : 
virtual services, data, avoid project delays, ca technologies, continuous testing, testing effort, service virtualisation, delivery ecosystem
    
CA Technologies EMEA
Published By: Schneider Electric     Published Date: Feb 12, 2018
Internet use is trending towards bandwidth-intensive content and an increasing number of attached “things”. At the same time, mobile telecom networks and data networks are converging into a cloud computing architecture. To support needs today and tomorrow, computing power and storage are being deployed at the network edge to lower data transport time and increase availability. Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. This white paper explains the drivers of edge computing and explores the various types of edge computing available.
Tags : 
    
Schneider Electric
Published By: Splunk     Published Date: Nov 29, 2018
DevOps allows teams to effectively build, test, release and respond to their software. But creating an agile, data-driven culture is easier said than done. Developer and DevOps teams struggle with a lack of visibility into application monitoring tools and systems, accelerated time-to-market pressure, and increased complexity throughout the DevOps lifecycle. As a Splunk customer, how are you using your machine data platform to adopt DevOps and optimize your application delivery pipeline? Download your copy of Driving DevOps Success With Data to learn:
• How machine data can optimize your application delivery
• The four key capabilities DevOps teams must have to optimize speed and customer satisfaction
• Sample metrics to measure your DevOps processes against
Tags : 
devops, devops tools, continuous delivery, devops methodology
    
Splunk
Published By: Splunk     Published Date: Nov 29, 2018
From protecting customer experience to preserving lines of revenue, IT operations teams face increasingly complex responsibilities, including preventing outages that could harm the organization. As a Splunk customer, your machine data platform empowers you to use machine learning to reduce MTTR. Discover how six companies use machine learning and AI to predict outages, protect business revenue and deliver exceptional customer experiences. Download the e-book to learn how:
• Micron Technology reduced the number of IT incidents by more than 50%
• Econocom provides better customer service by centralizing once-siloed analytics, improving SLA performance and significantly reducing the number of events
• TransUnion combines machine data from multiple applications to create an end-to-end transaction flow
Tags : 
predictive it, predictive it tools, predictive analytics for it, big data and predictive analytics
    
Splunk
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to analyzing and mitigating the risks of migrating to PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach when evaluating the potential return on investment of a database technology migration. A key decision criterion for adopting any technology is whether it can support the requirements of existing applications while also fitting into longer-term strategies and needs. The first section of this eBook provides a detailed analysis of all aspects of migrating from legacy and commercial solutions to PostgreSQL:
• Schema and code migration
• Data migration
• Application code migration
• Testing and evaluation
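To make the data-migration step concrete, here is a minimal sketch of batch-copying one table into PostgreSQL with Python’s psycopg2 driver. It is an illustration only, not the eBook’s method: the DSNs, table and column names are hypothetical, and for brevity both source and target are assumed reachable through psycopg2 (a real legacy source would use its own driver).

    # A minimal data-migration sketch under the assumptions stated above.
    import psycopg2
    from psycopg2.extras import execute_values

    SOURCE_DSN = "host=legacy-host dbname=erp user=reader"  # hypothetical
    TARGET_DSN = "host=pg-cluster dbname=erp user=writer"   # hypothetical

    def migrate_table(table, columns, batch_size=10_000):
        # Identifiers are trusted in this sketch; a real tool would quote them.
        cols = ", ".join(columns)
        with psycopg2.connect(SOURCE_DSN) as src, psycopg2.connect(TARGET_DSN) as dst:
            with src.cursor() as read_cur, dst.cursor() as write_cur:
                read_cur.execute(f"SELECT {cols} FROM {table}")
                while True:
                    rows = read_cur.fetchmany(batch_size)  # stream in batches
                    if not rows:
                        break
                    execute_values(
                        write_cur,
                        f"INSERT INTO {table} ({cols}) VALUES %s",
                        rows,
                    )
        # Each connection's context manager commits its transaction on exit.

    migrate_table("customers", ["id", "name", "email"])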
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “day two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach when evaluating the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, a variety of day-two scenarios require planning and strategizing. The third section of this eBook provides a detailed analysis of all aspects of accelerating large-scale PostgreSQL deployments:
• Backups and availability: strategies, point-in-time recovery, availability and scalability
• Upgrades and DevOps: the PostgreSQL upgrade process, application upgrades and CI/CD
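As a small illustration of the day-two monitoring these scenarios imply, the following sketch (an example written for this summary, not taken from the eBook) uses psycopg2 against a PostgreSQL 10+ instance to check whether a node is a primary or a replica and where its write-ahead log stands; the DSN is a hypothetical placeholder.

    # A small availability-monitoring sketch, assuming psycopg2 and PG 10+.
    import psycopg2

    conn = psycopg2.connect("host=pg-cluster dbname=postgres user=monitor")
    with conn.cursor() as cur:
        cur.execute("SELECT pg_is_in_recovery()")
        if cur.fetchone()[0]:
            # Replica: report how far WAL replay has progressed.
            cur.execute("SELECT pg_last_wal_replay_lsn()")
            print("replica, replayed to", cur.fetchone()[0])
        else:
            # Primary: report the current write-ahead log position.
            cur.execute("SELECT pg_current_wal_lsn()")
            print("primary, WAL at", cur.fetchone()[0])
    conn.close()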
Tags : 
    
Stratoscale
Published By: Teradata     Published Date: Jan 27, 2015
There is little question about the role that SAP® BW has historically played in the SAP® infrastructure. It has been a key element in unlocking SAP’s vast and complex store of operational data housed in its ERP applications. One can easily think of BW as the original Rosetta Stone for this data, for without it, end users and IT shops would have faced daunting coding tasks attempting to cull the proper elements out of the SAP® ERP code for reporting. However, BW and the tools associated with it were rudimentary to begin with, and have not advanced to meet the pressing demands of today’s end users. The New Rosetta Stone 2.0 for SAP® ERP Data, and More. Download now!
Tags : 
rosetta stone, data, erp, sap®, teradata, it management
    
Teradata
Published By: Here Technologies     Published Date: Apr 02, 2019
In this report, VSI applies HERE’s HD map data to a lane keeping application and compares the performance of map-based lane keeping against a camera- and computer-vision-based approach. VSI tested the lane keeping system with and without map data on a local road in three scenarios:
• Lane lines expanding into a turn or exit lane
• An intersection without lane lines
• A widening in the lane
The results show that in all scenarios, the computer-vision-only lane keeping systems became confused and made errors in the vehicle’s trajectory when lane markings were out of the ordinary or invisible. Faced with the same road conditions, the map-based lane keeping system stayed within the desired trajectory, outperforming the computer-vision-only systems. The report demonstrates that using a lane model from an HD map can solve common issues involved in computer-vision-only lane keeping.
Tags : 
over the air technologies, location data, auto, mapping
    
Here Technologies
Published By: Riverbed     Published Date: May 19, 2016
Data protection, application performance, and business continuity are essential to staying competitive and profitable.
Tags : 
data, data protection, data centre, security, data security, application security, disaster recovery, security management, security policies, database development, database security, data loss prevention
    
Riverbed
Published By: Pure Storage     Published Date: Jul 26, 2017
The big breakthrough is coming from one of the leading innovators in the all-flash market, Pure Storage. Pure has scaled its architecture to make all of its key enterprise-class features available in an entry-level array that “democratizes” all-flash storage by making it affordable to just about any business. The new product is called the Pure FlashArray//m10.
Tags : 
enterprise reliability, flash arrays, cloud data platform, pure storage, data growth, application performance, customer needs, mobile
    
Pure Storage
Published By: Pure Storage     Published Date: Oct 09, 2018
Apache® Spark™ has become a vital technology for development teams looking to leverage an ultrafast in-memory data engine for big data analytics. Spark is a flexible open-source platform, letting developers write applications in Java, Scala, Python or R. With Spark, development teams can accelerate analytics applications by orders of magnitude.
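As a minimal illustration of the in-memory pattern described above (assuming pyspark is installed and a local Spark runtime is available; the input file events.json and its columns are hypothetical):

    # A minimal in-memory analytics sketch using PySpark.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("analytics-sketch").getOrCreate()

    # Cache the dataset once so repeated queries run against memory.
    events = spark.read.json("events.json").cache()

    # Aggregate event counts and average latency per action.
    events.groupBy("action").agg(
        F.count("*").alias("events"),
        F.avg("ms").alias("avg_latency_ms"),
    ).orderBy(F.desc("events")).show()

    spark.stop()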
Tags : 
    
Pure Storage
Published By: Pure Storage     Published Date: Apr 10, 2019
Deep learning opens up new worlds of possibility in artificial intelligence, enabled by advances in computational capacity, the explosion in data, and the advent of deep neural networks. But data is evolving quickly and legacy storage systems are not keeping up. Advanced AI applications require a modern all-flash storage infrastructure that is built specifically to work with high-powered analytics.
Tags : 
    
Pure Storage
Published By: Forcepoint     Published Date: Apr 20, 2016
In 2014, data breaches compromised more than 700 million records, with estimated financial losses of at least $400 million, and security incidents increased by as much as 66%. But don’t let fear stifle growth. The “Data Theft Prevention” report focuses on how to innovate in complete security, from a perspective that is broader in scope, more targeted and smarter in application.
Tags : 
security, data theft prevention, data security, security application, anti spam, anti spyware, anti virus, application security, internet security, intrusion prevention, security management
    
Forcepoint
Published By: Forcepoint     Published Date: Apr 20, 2016
In 2014, data breaches compromised more than 700 million records, generating financial losses estimated at no less than $400 million, and the frequency of security incidents rose by 66 percent. But don’t let fear stifle your growth. The Data Theft Prevention report focuses on how to protect yourself while you innovate, from a perspective that is broader in scope and smarter in application.
Tags : 
security, data theft prevention, data security, security application, anti spam, anti spyware, anti virus, application security, internet security, intrusion detection, security policies
    
Forcepoint
Published By: Forcepoint     Published Date: May 16, 2016
In 2014, more than 700 million data records were compromised through data breaches. Financial losses are estimated at no less than $400 million, while the number of security incidents rose by 66 percent. Don’t let fear slow your growth. The “Data Theft Prevention” report focuses on how to stay secure from a broader, smarter application perspective while you drive innovation.
Tags : 
security, data theft prevention, data security, security application, anti spam, anti spyware, anti virus, application security, intrusion detection, security policies
    
Forcepoint
Published By: Pure Storage     Published Date: Mar 15, 2018
Managing technology refreshes is not a popular task among enterprise storage administrators, but it is a necessary one for successful businesses. As a business evolves, managing more data and adding new applications in the process, enterprise storage infrastructure inevitably needs to grow in performance and capacity. Enterprise storage solutions have traditionally imposed limits on how easily technology refreshes can keep infrastructure current, reliable and cost effective. In 2015, Pure Storage introduced a new technology refresh model that has driven strong change in the enterprise storage industry by addressing the major pain points of legacy models and providing a much more cost-effective life-cycle management approach. In conjunction with other aspects of Pure Storage's enterprise storage product and services offerings, the company's "Evergreen Storage" technology refresh model has contributed to this all-f
Tags : 
    
Pure Storage
Published By: IBM     Published Date: Jan 27, 2017
A solid information integration and governance program must become a natural part of big data projects, supporting automated discovery, profiling and understanding of diverse data sets to provide context and enable employees to make informed decisions. It must be agile to accommodate a wide variety of data and seamlessly integrate with diverse technologies, from data marts to Apache Hadoop systems. And it must automatically discover, protect and monitor sensitive information as part of big data applications.
Tags : 
    
IBM
Published By: IBM     Published Date: Apr 18, 2017
The data integration tool market was worth approximately $2.8 billion in constant currency at the end of 2015, an increase of 10.5% from the end of 2014. The discipline of data integration comprises the practices, architectural techniques and tools that ingest, transform, combine and provision data across the spectrum of information types in the enterprise and beyond — to meet the data consumption requirements of all applications and business processes. The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine "data lakes" with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.
Tags : 
data integration, data security, data optimization, data virtualization, database security, data analytics, data innovation
    
IBM
Published By: IBM     Published Date: Sep 28, 2017
Here are the six reasons to change your database:
• Lower total cost of ownership
• Increased scalability and availability
• Flexibility for hybrid environments
• A platform for rapid reporting and analytics
• Support for new and emerging applications
• Greater simplicity
Download now to learn more!
Tags : 
scalability, hybrid environment, emerging applications, rapid reporting
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Oracle     Published Date: Jan 28, 2019
This report provides an overview of the Oracle Cloud at Customer portfolio (which includes Oracle Exadata Cloud at Customer, Oracle Big Data Cloud at Customer, Oracle SaaS at Customer and Oracle Cloud at Customer) and analyzes its ability to satisfy enterprises’ need for a next-generation computing platform, one that allows workloads to be deployed across the premises and the public cloud. For enterprises running their next-generation applications on such a platform, Oracle Cloud at Customer does very well because of Oracle’s vision of the “chip-to-click” integrated technology stack (i.e., from the CPU silicon, across all OSI layers and all the way to the end-user mouse click). With Oracle using the same technology stack and machines both in its cloud and on premises, it has the highest degree of identicality across these offerings of all the vendors that are part of Constellation Research’s Market Overview on next-genera
Tags : 
    
Oracle
Published By: Oracle     Published Date: Jan 28, 2019
Traditionally, the best practice for mission-critical Oracle Database backup and recovery was to use storage-led, purpose-built backup appliances (PBBAs) such as Data Domain, integrated with RMAN, Oracle’s automated backup and recovery utility. This disk-based backup approach solved two problems:
1) It enabled faster recovery (from disk versus tape)
2) It increased recovery flexibility by storing many more backups online, enabling restoration from that data to recover production databases and provisioning copies for test/dev
At its core, however, this approach remains a batch process that involves many dozens of complicated steps for backups and even more steps for recovery. Oracle’s Zero Data Loss Recovery Appliance (RA) customers report that total cost of ownership (TCO) and downtime costs (e.g., lost revenue due to database or application downtime) are significantly reduced thanks to the simplification and, where possible, the automation of the backup and recovery process.
Tags : 
    
Oracle