data processing

Results 1 - 25 of 157
Published By: Adobe     Published Date: Feb 20, 2014
San Diego County District Attorney’s Office accelerates Juvenile Court proceedings using Adobe® Acrobat® Pro in Microsoft SharePoint environment.
Tags : 
adobe, adobe acrobat pro, file management, software, data management, electronic documentation, paperless processing, pdf documents
    
Adobe
Published By: Adobe     Published Date: Feb 20, 2014
Gain more efficient ways of working with documents and collaborating with others on them.
Tags : 
adobe, adobe acrobat pro, microsoft applications, collaboration, merging documents, editing documents, pdf to office format, file formatting
    
Adobe
Published By: Adobe     Published Date: Aug 02, 2017
Customer experience (CX) continues to dominate the agenda, and the research makes clear that retailers collectively grasp the importance of a personalised, mobile-friendly experience that is relevant at every stage of the customer journey. It makes absolute sense for retailers to focus on optimising the mobile experience as part of their CX initiatives. CX is a differentiator for retail brands, and as part of this trend, the mobile experience in particular will increasingly define your brand.
Tags : 
digital skills, culture, strategy, data management, processing, technology, ux design
    
Adobe
Published By: Amazon Web Services     Published Date: Feb 01, 2018
Moving Beyond Traditional Decision Support
Future-proofing a business has never been more challenging. Customer preferences turn on a dime, and their expectations for service and support continue to rise. At the same time, the data lifeblood that flows through a typical organization is more vast, diverse, and complex than ever before. More companies today are looking to expand beyond traditional means of decision support, and are exploring how AI can help them find and manage the “unknown unknowns” in today’s fast-paced business environment.
Tags : 
predictive, analytics, data lake, infrastructure, natural language processing, amazon
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
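The core idea here, querying data where it sits instead of loading it into the warehouse first, can be illustrated with a local analogy. The sketch below is plain Python with no AWS involved; the bucket paths, file contents, and column names are invented for illustration.

```python
import csv
import io

# Simulate two objects in an S3 bucket: the data stays "external"
# and is scanned at query time rather than loaded into a warehouse.
objects = {
    "s3://bucket/sales/2018-01.csv": "region,amount\nus,100\neu,250\n",
    "s3://bucket/sales/2018-02.csv": "region,amount\nus,300\neu,50\n",
}

def scan_and_sum(objects, region):
    """Aggregate over external files in place, one row at a time."""
    total = 0
    for body in objects.values():
        for row in csv.DictReader(io.StringIO(body)):
            if row["region"] == region:
                total += int(row["amount"])
    return total

print(scan_and_sum(objects, "us"))  # 400
```

As in the analogy, the external store can keep growing independently of the query engine, which is what the "Scalability Dilemma" point above refers to.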
Tags : 
    
Amazon Web Services
Published By: AMD     Published Date: Jul 20, 2012
Server virtualization falls within the domain of IT operations, but it can very much be a business matter with a measurable impact on the bottom line. Esoteric as it may seem from a business perspective, the server platform, powered by its processor, sits at the heart of that virtual power equation.
Tags : 
virtualization, cloud connection, public cloud, private cloud, private/public hybrid, virtual power equation, data processing, virtual machine density
    
AMD
Published By: Automation Anywhere     Published Date: Feb 21, 2019
Automation Anywhere’s flagship product is Automation Anywhere Enterprise, an RPA platform offering a variety of tools to help organisations develop, operate and manage RPA bots that automate data entry, data gathering and other routine tasks usually carried out as part of high-volume, repetitive work (for example, service fulfilment work in call centres, shared-service centres and back-office processing environments). Automation Anywhere Enterprise bots can add value in both unattended (server-based, lights-out operation) and attended (desktop-based, interactive) deployment configurations. In this report, MWD Advisors digs deeper into the features and capabilities of Automation Anywhere’s product portfolio, analysing its fast-growth trajectory and highlighting large-scale implementations.
Tags : 
    
Automation Anywhere
Published By: AWS     Published Date: Jul 16, 2018
The Internet of Things (IoT) is composed of sensor-embedded devices and machines that exchange data with each other and the cloud through a secure network. Often referred to as “things” or “edge devices”, these intelligent machines connect to the internet either directly or through an IoT gateway, enabling them to send data to the cloud. Analyzing this data can reveal valuable insights about these objects and the business processes they’re part of, helping enterprises optimize their operations. Devices in IoT deployments can span nearly any industry or use case. Each one is equipped with sensors, processing power, connectivity, and software, enabling asset control and other remote interactions over the internet. Unlike traditional IT assets, these edge devices are resource-constrained (either by bandwidth, storage, or processing power) and are typically found outside of a data center, creating unique security and management considerations.
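Edge devices typically publish their sensor readings as small structured messages over the network. A minimal sketch of such a payload follows; the field names and device ID are invented for illustration, not a standard schema.

```python
import json
import time

def build_reading(device_id, temperature_c):
    """Shape a minimal telemetry message an edge device might publish.

    Field names here are illustrative, not a standard IoT schema.
    """
    return json.dumps({
        "device_id": device_id,
        "ts": int(time.time()),
        "temperature_c": temperature_c,
    })

msg = build_reading("sensor-42", 21.5)
decoded = json.loads(msg)
print(decoded["device_id"])  # sensor-42
```

Keeping messages this small matters because, as noted above, edge devices are constrained by bandwidth, storage, and processing power.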
Tags : 
    
AWS
Published By: AWS     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
Tags : 
    
AWS
Published By: AWS     Published Date: Oct 12, 2018
Safeguarding your data is more important than ever. In today’s data-driven business landscape, companies are using their data to innovate, inform product improvements, and personalize services for their customers. The sheer volume of data collected for these purposes keeps growing, but the solutions available to organizations for processing and analyzing it become more efficient and intuitive every day. Reaching the right customers at the right time with the right offers has never been easier. With this newfound agility, however, comes new opportunities for vulnerability. With so much riding on the integrity of your data and the services that make it secure and available, it’s crucial to have a plan in place for unexpected events that can wipe out your physical IT environment or otherwise compromise data access. The potential for natural disasters, malicious software attacks, and other unforeseen events necessitates that companies implement a robust disaster recovery (DR) strategy.
Tags : 
    
AWS
Published By: AWS     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
Tags : 
    
AWS
Published By: AWS - ROI DNA     Published Date: Jun 12, 2018
Traditional data processing infrastructures—especially those that support applications—weren’t designed for our mobile, streaming, and online world. However, some organizations today are building real-time data pipelines and using machine learning to improve active operations. Learn how to make sense of every format of log data, from security to infrastructure and application monitoring, with IT Operational Analytics, enabling you to reduce operational risks and quickly adapt to changing business conditions.
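"Making sense of every format of log data" usually starts with normalizing free-form lines into structured records. A minimal sketch, assuming a hypothetical log format (real formats vary widely, which is exactly why this normalization step is needed):

```python
import re

# A hypothetical application log line; the format is invented here.
LINE = "2018-06-12T10:04:31Z ERROR payment-api latency_ms=950"

PATTERN = re.compile(
    r"(?P<ts>\S+)\s+(?P<level>\w+)\s+(?P<service>\S+)\s+latency_ms=(?P<latency>\d+)"
)

def parse(line):
    """Turn one raw log line into a structured record, or None on mismatch."""
    m = PATTERN.match(line)
    if m is None:
        return None
    rec = m.groupdict()
    rec["latency"] = int(rec["latency"])
    return rec

rec = parse(LINE)
print(rec["service"], rec["latency"])  # payment-api 950
```

Once lines are records, the same downstream analytics can run over security, infrastructure, and application logs alike.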
Tags : 
    
AWS - ROI DNA
Published By: BlackBerry Cylance     Published Date: Apr 15, 2019
Artificial intelligence (AI) leads the charge in the current wave of digital transformation underway at many global companies. Organizations large and small are actively expanding their AI footprints as executives work out what AI is and how they can use it to capitalize on business opportunities, gaining insight from the data they collect to engage customers and hone a competitive edge. But, while AI may indeed be the frontier of enterprise technology, there remain many misconceptions about it. Part of the confusion stems from the fact that AI is an umbrella term that covers a range of technologies, including machine learning, computer vision, natural language processing, and deep learning, that are in various stages of development and deployment. The use of AI for dynamic pricing and targeted marketing has been in use for a while, but actual AI computing, where machines think like humans, is still many years from becoming mainstream.
Tags : 
    
BlackBerry Cylance
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premise environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premise system for consolidation with other data, and then through various cloud and on-premise applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses - often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during proof of concept, manual processes don't scale.
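The dependency structure described above is a directed acyclic graph, and orchestration tools fundamentally compute a valid execution order over it. A minimal sketch with Python's standard-library `graphlib` (step names are invented to mirror the cloud/on-premise hops in the scenario):

```python
from graphlib import TopologicalSorter

# Each step maps to the set of steps it depends on; names are
# illustrative, mirroring the pipeline described above.
pipeline = {
    "ingest_cloud_source": set(),
    "normalize": {"ingest_cloud_source"},
    "consolidate_on_prem": {"normalize"},
    "analyze": {"consolidate_on_prem"},
    "publish_dashboard": {"analyze"},
}

# static_order() yields steps so that every dependency runs first.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

A scheduler adds retries, SLAs, and cross-environment execution on top of this ordering, which is what makes manual coordination impractical at scale.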
Tags : 
    
BMC ASEAN
Published By: BMC Software     Published Date: May 28, 2014
Learn how BMC Control-M for Hadoop allows IT to extract value from big data, with clear visibility, predictability and increased productivity and efficiency.
Tags : 
    
BMC Software
Published By: BMC Software     Published Date: May 28, 2014
In the paper “Integrate Big Data into Your Business Processes and Enterprise Systems,” you’ll learn how to drive maximum value with an enterprise approach to Big Data. Topics discussed include:
• How to ensure that your Big Data projects will drive clearly defined business value
• The operational challenges each Big Data initiative must address
• The importance of using an enterprise approach for Hadoop batch processing
Tags : 
    
BMC Software
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, the importance of providing excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for providing high availability and a fast response time by optimizing mainframe performance with new machine learning and analytics capabilities.
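The abstract doesn't detail the models involved, but the general shape of analytics-driven performance monitoring can be sketched simply: learn a baseline from recent response times and flag readings far outside it. A plain z-score stands in here for the learned models; the numbers are invented.

```python
import statistics

def is_anomalous(history, latest, threshold=3.0):
    """Flag a response time far outside the recent baseline.

    A simple z-score test; production systems would use richer models.
    """
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

baseline_ms = [40, 42, 41, 39, 43, 40, 41, 42]
print(is_anomalous(baseline_ms, 120))  # True
print(is_anomalous(baseline_ms, 44))   # False
```

Catching the 120 ms outlier before users notice is the kind of availability and response-time win the note describes.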
Tags : 
    
CA Technologies
Published By: CA Technologies EMEA     Published Date: Aug 03, 2017
Using CA Live API Creator, you can execute business policies using Reactive Logic. You write simple declarative rules defining relationships across data fields, and they’re automatically enforced when changes occur—just like formulas in a spreadsheet. Reactive Logic should cover most of your application requirements, but you also have the ability to configure event processing or external callouts using server-side JavaScript or imported Java® libraries if you so desire.
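To make the spreadsheet analogy concrete, here is a toy sketch of reactive rules in plain Python: each derived field is declared once and recomputed whenever the underlying data changes. This mimics the idea only; it is not CA Live API Creator's actual API, and the field names are invented.

```python
# Declarative rules: derived field -> formula over the row,
# like spreadsheet cells defined in terms of other cells.
rules = {
    "amount": lambda row: row["qty"] * row["unit_price"],
    "total": lambda row: row["amount"] + row["shipping"],
}

def react(row):
    """Re-fire every rule so derived fields stay consistent."""
    for field, rule in rules.items():
        row[field] = rule(row)
    return row

order = {"qty": 3, "unit_price": 10, "shipping": 5}
react(order)
print(order["total"])  # 35

order["qty"] = 5  # a change arrives...
react(order)      # ...and the rules automatically re-derive the totals
print(order["total"])  # 55
```

The appeal of the declarative style is that the rule is stated once, and enforcement on every change is the engine's job, not the developer's.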
Tags : 
api, application programming interface, psd2, open banking, json, github
    
CA Technologies EMEA
Published By: CA Technologies EMEA     Published Date: May 23, 2018
CA Live API Creator creates application back-ends exposing enterprise-class REST/JSON APIs, including access to existing data and applications. It enables developers to create new REST endpoints that join data across diverse data sources using a point-and-click approach. API owners can extend the API with declarative business rules, JavaScript event processing, role-based security and interactive testing. The CA Live API Creator Reactive Logic model yields systems that are highly scalable and reliable. Its optimized services run more efficiently and with less fragility than services manually coded by skilled developers and architects.
Tags : 
    
CA Technologies EMEA
Published By: Calpont     Published Date: Mar 13, 2012
This white paper offers insight into how the Calpont InfiniDB database performs far beyond the abilities of a row-based database.
Tags : 
data warehouse, warehouse, benchmark, infinidb, calpont, row-based, mpp
    
Calpont
Published By: CheckMarx     Published Date: Sep 12, 2019
Financial services organizations operate under a host of regulatory standards. This makes sense, as the assets and information managed by these firms are valuable, sensitive, and targeted by sophisticated cyber attackers daily. Compounding these challenges is the large volume of personally identifiable information (PII) that financial organizations handle regularly. PII is subject to many compliance regulations, notably the General Data Protection Regulation (GDPR), which governs the processing of personal data, including PII, relating to individuals in the EU and applies not only to organizations in the EU but also to any organization that processes the personal data of EU residents. For US banking consumers, Section 5 (Unfair or Deceptive Acts or Practices) of the Federal Trade Commission Act and numerous state regulations enforce basic consumer protections, which financial organizations must also uphold.
Tags : 
    
CheckMarx
Published By: Cisco     Published Date: Dec 15, 2010
Understanding frequently encountered data workflow processing pain points and new strategies for addressing them.
Tags : 
cisco, business intelligence, etl, process management pain point, process management, data workflow, enterprise data delivery, interactive analytic
    
Cisco
Published By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
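The abstract doesn't describe the solution's internals, but the geometry behind any distance-based Wi-Fi positioning can be sketched with textbook trilateration: three access points at known positions each measure a distance to the device, and the circle equations are linearized and solved. The coordinates and distances below are invented; real systems add filtering and measurement-error handling.

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Estimate (x, y) from three anchors and measured distances.

    Subtracting the circle equations pairwise gives a 2x2 linear
    system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three access points (positions in meters) and their measured
# distances to a device actually located at (3, 4).
x, y = trilaterate((0, 0), 5.0, (10, 0), math.sqrt(65), (0, 10), math.sqrt(45))
print(round(x, 2), round(y, 2))  # 3.0 4.0
```

One-to-three-meter accuracy then depends on how precisely those per-AP distances can be measured, which is where the antenna and RF design mentioned above comes in.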
Tags : 
    
Cisco
Published By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA
Published By: ClearStory     Published Date: Oct 07, 2014
Organizations are more data hungry than ever. Thanks to advances in machine learning and semantic processing, they can now gain new insights from that data. ClearStory Data helps business users gain new insights into their markets and the environments in which they operate.
Tags : 
data hungry, semantic processing, insight, market environment
    
ClearStory