CWjobs.co.uk


CWJobs Whitepaper Directory

Choose from hundreds of technology whitepapers from leading industry experts including IBM, Cisco, Symantec and Citrix.

Covering IT categories including Data Management, Networking, Security and much more, the CWJobs Whitepaper Directory is a great source for technical whitepapers and IT information. What's more, it is completely free of charge.

Stay in touch with the latest IT information, trends and research today!

large database

Results 26 - 45 of 45. Sort Results By: Published Date | Title | Company Name
Published By: HP     Published Date: Jul 25, 2008
Countless companies want to take advantage of Microsoft SQL Server 2005 and its notable business and technology benefits. However, with the prevalence of large, complex and sprawling information technology (IT) infrastructures, many organizations are wary of - or avoid altogether - the time, expense and effort required for a mass overhaul of their database environment.
Tags : 
sql, sql server, migration, infrastructure, server migration, polyserve, data management, network management, servers
    
HP
Published By: Microsoft Dynamics     Published Date: Jul 03, 2008
Microsoft Dynamics CRM 4.0 demonstrated its ability to scale to support the needs of an enterprise organization with a very large customer service database. In a test based on a customer database of over 1 billion records, Microsoft Dynamics CRM was able to achieve sub-second response times using a modest hardware configuration.
Tags : 
crm, enterprise crm, customer relationship management, enterprise software, microsoft, microsoft dynamics, microsoft crm
    
Microsoft Dynamics
Published By: Webroot     Published Date: Sep 18, 2013
This whitepaper explains how exploding mobile threats challenge mobile device management (MDM) vendors, carriers, service providers, and app stores to ensure the safety of apps. Consumers and customers will hold these companies responsible for providing adequate security, and employees will expect enterprise IT departments to protect them from malicious mobile apps, just as they have come to expect for PC malware. The report describes how a cloud-based app reputation service protects against the risks of mobile applications in the wild:
• Collects millions of applications from a variety of sources
• Stores mobile app data in the world’s largest cloud-based threat database
• Analyzes and scores apps on a range from malicious (e.g., known malware such as Trojans or rootkits) to trustworthy
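The scoring pipeline described above can be sketched in a few lines. This is a hypothetical illustration only: the hash set, signal names, and thresholds are assumptions for the sketch, not Webroot's actual service logic.

```python
# Hypothetical app reputation scorer: known-malware hashes are flatly
# malicious; otherwise heuristic risk signals reduce a trust score.
KNOWN_MALWARE_HASHES = {"a1b2c3", "d4e5f6"}  # assumed sample hashes


def reputation_score(app_hash, requests_sms_send=False, hides_icon=False):
    """Return a score from 0 (malicious) to 100 (trustworthy)."""
    if app_hash in KNOWN_MALWARE_HASHES:
        return 0  # known Trojan/rootkit: no further analysis needed
    score = 100
    if requests_sms_send:
        score -= 40  # premium-SMS abuse is a common monetization vector
    if hides_icon:
        score -= 30  # hiding the launcher icon is a stealth indicator
    return max(score, 0)
```

A real reputation service would aggregate many more signals (permissions, code similarity, publisher history) across millions of collected apps, but the shape is the same: map each app onto a single malicious-to-trustworthy scale.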
Tags : 
mobile app, mobile threat, mobile device management, mobile application management, malicious mobile app, security, business technology
    
Webroot
Published By: Webroot     Published Date: Sep 18, 2013
This whitepaper deals with the rise of mobility, BYOD and social networking, and how these trends have led cybercriminals to exploit vulnerabilities in browsers and mobile apps. For example, more than 30,000 mostly legitimate websites become infected with malware every day. From drive-by downloads to spearphishing to XML injection, web-borne threats represent a significant new risk for businesses. The report describes how to stay on top of this changing threat landscape and prevent damaging attacks with:
• 100% protection against known viruses
• Industry-leading URL filtering and IP protection via the world’s largest threat database
• Extended protection for smartphones and tablets
• Simplified web-based management
Tags : 
web security, cybercriminals, web threats, web gateway, security, business technology
    
Webroot
Published By: Silver-Peak     Published Date: Mar 21, 2012
Learn why one of the world's largest media companies chose Silver Peak virtual WAN optimization for its cross-country replication challenges, including an on-demand media database requiring 24x7 access. Complicating the challenge was a lack of space in the state-of-the-art data center for additional physical hardware. No problem.
Tags : 
silver-peak, disaster recovery, wan, technology, optimization, data center, data management, vulnerability management
    
Silver-Peak
Published By: Group M_IBM Q418     Published Date: Sep 10, 2018
IBM LinuxONE™ is an enterprise Linux server engineered to deliver cloud services that are secure, fast and instantly scalable. The newest member of the family, IBM LinuxONE Emperor™ II, is designed for businesses where the following may be required:
• protecting sensitive transactions and minimizing business risk
• accelerating the movement of data, even with the largest databases
• growing users and transactions instantly while maintaining operational excellence
• accessing an open platform that speeds innovation
Tags : 
    
Group M_IBM Q418
Published By: SAS     Published Date: Mar 06, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. With the right end-user tools, a data lake can enable the self-service data practices that both technical and business users need. These practices wring business value from big data, other new data sources, and burgeoning enterprise data.
Tags : 
    
SAS
Published By: Datawatch     Published Date: Mar 21, 2014
Big Data is not a new problem. Companies have always stored large amounts of data—structured like databases, unstructured like documents—in multiple repositories across the enterprise. The most important aspect of big data is not how big it is, or where it should be stored, or how it should be accessed. It’s the efficacy of business intelligence tools to plumb its depths for patterns and trends, to derive insight from it that will give companies competitive advantage in an increasingly challenging business climate. Visualization allows companies to analyze big data in real-time across a variety of sources in order to make better business decisions.
Tags : 
visual data discovery, decision making software, data variety, business analysis, data visualization, big data, business analytics, business intelligence, real-time data, real-time data visualization, real-time data discovery, data variety software, data discovery software, data analysis software, data mining tools, data extraction, data reporting, pdf to excel, business reporting, data solutions
    
Datawatch
Published By: IBM     Published Date: Apr 20, 2017
The growth of virtualization has fundamentally changed the data center and raised numerous questions about data security and privacy. In fact, security concerns are the largest barrier to cloud adoption. Read this e-Book and learn how to protect sensitive data and demonstrate compliance. Virtualization is the creation of a logical rather than an actual physical version of something, such as a storage device, hardware platform, operating system, database or network resource. The usual goal of virtualization is to centralize administrative tasks while improving resilience, scalability and performance and lowering costs. Virtualization is part of an overall trend in enterprise IT towards autonomic computing, a scenario in which the IT environment will be able to manage itself based on an activity or set of activities. This means organizations use or pay for computing resources only as they need them.
Tags : 
data protection, data security, data optimization, organization optimization, cloud management, virtualization, data center, cloud environment
    
IBM
Published By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
Published By: Internap     Published Date: Dec 02, 2014
NoSQL databases are now commonly used to provide a scalable system to store, retrieve and analyze large amounts of data. Most NoSQL databases are designed to automatically partition data and workloads across multiple servers to enable easier, more cost-effective expansion of data stores than the single-server/scale-up approach of traditional relational databases. Public cloud infrastructure should provide an effective host platform for NoSQL databases given its horizontal scalability, on-demand capacity, configuration flexibility and metered billing; however, the performance of virtualized public cloud services can suffer relative to bare-metal offerings in I/O-intensive use cases. Benchmark tests comparing the latency and throughput of operating a high-performance, in-memory (flash-optimized), key-value store NoSQL database on popular virtualized public cloud services and an automated bare-metal platform show performance advantages of bare-metal over virtualized public cloud.
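The automatic partitioning described above can be sketched with a simple hash-based key placement scheme. The server names are illustrative, and production NoSQL databases typically use consistent hashing or range partitioning with rebalancing, but the core idea is the same: each key deterministically maps to one node, so capacity grows by adding servers rather than scaling up one machine.

```python
import hashlib

# Assumed cluster nodes for illustration.
SERVERS = ["node-a", "node-b", "node-c"]


def partition_for(key: str, servers=SERVERS) -> str:
    """Map a key to one server by hashing it modulo the cluster size."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]
```

Because the mapping is deterministic, any client can locate a key's owning server without a central lookup, which is what makes horizontal scale-out cheap relative to the scale-up model.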
Tags : 
internap, performance analysis, benchmarking, nosql, bare-metal, public cloud, infrastructure, on demand capacity, it management, data center
    
Internap
Published By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
Published By: IBM     Published Date: Dec 06, 2013
Most enterprise datacenters are so cluttered with individually constructed servers, including database servers, that staff time is largely taken up with just the maintenance of these systems. As a result, IT service to business users suffers, and the agility of the enterprise suffers. Integrated systems represent an antidote to this problem. By dramatically reducing the amount of setup and maintenance time for the applications and databases they support, integrated systems enable the technical staff to spend more of their time supporting users and enabling the enterprise to thrive.
Tags : 
ibm, ibm puredata system, transaction, datacenter, data management, integrated system, configuration, consolidation, enterprise applications, servers, productivity, business technology, data center
    
IBM
Published By: Oracle Corp.     Published Date: Mar 05, 2013
While Oracle has long offered Oracle Exadata to serve the needs of larger enterprises, the Oracle Database Appliance now offers that same high availability, service, and reliability to small- and mid-sized markets.
Tags : 
oracle, database, oracle database appliance, oracle exadata, business technology
    
Oracle Corp.
Published By: Alacritech     Published Date: Aug 21, 2009
Customers are increasingly constrained by large backup data transfers, long backup times to storage devices, and the resulting inefficient use of server resources. System and database availability is heavily impacted during backup periods, and with the rapid growth of storage and the need to archive customer data, IT managers are actively seeking solutions for their backup problems.
Tags : 
data backup, backup and recovery, storage, database availability, backup, recovery, servers, backup server, backup servers, data transfer, alacritech
    
Alacritech
Published By: IBM     Published Date: Dec 30, 2008
Research giant Forrester estimates that on average, data repositories for large applications grow by fifty percent annually. However, up to half of all that data can be duplicate or otherwise unnecessary. With no end in sight for the increase of raw data (both structured and unstructured), organizations must be ever more strategic about where and how to store enterprise information. Meanwhile, a tightening economy is putting pressure on costs, just as compliance mandates call for greater visibility into processes and data. What to do? Register for this Web Seminar to learn the four pillars of strategic storage, and how they can be used to simultaneously reduce costs while improving speed, accuracy and accountability. The four pillars will be discussed in detail, with real-world examples of each: deep compression, database archiving, de-duplication and thin provisioning.
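Of the four pillars, de-duplication lends itself to a compact sketch: identical chunks of data are stored once and referenced by content hash thereafter, which is how duplicate data stops consuming extra capacity. The chunking granularity and store layout below are assumptions for illustration, not any specific IBM product's implementation.

```python
import hashlib


def dedup_store(chunks, store=None):
    """Store each unique chunk once; return (store, per-chunk references).

    chunks: iterable of bytes objects (e.g., fixed-size file blocks).
    store:  dict mapping content hash -> chunk bytes; reused across calls.
    """
    store = {} if store is None else store
    refs = []
    for chunk in chunks:
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)  # write the chunk only if unseen
        refs.append(h)              # the reference is just the hash
    return store, refs
```

If half of the incoming blocks are duplicates, as the Forrester estimate above suggests, the store holds roughly half the raw volume while the reference list still reconstructs every block in order.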
Tags : 
strategic storage, ibm, enterprise information, deep compression, database archiving, de-duplication, thin provisioning, knowledge management, storage, data management
    
IBM
Published By: Red Hat     Published Date: Jan 20, 2011
In this session, attendees will explore the new features of Red Hat Enterprise Linux 6 performance. John Shakshober (Shak) will share the scalability optimizations for larger SMP systems, > 1TB of memory, new transparent hugepage support, multi-queue network performance of 10 Gbit and InfiniBand, and KVM enhancements that went into Red Hat Enterprise Linux 6. Shak will also share benchmark data from industry-standard workloads using common applications such as database servers, Java engines, and various financial applications on the latest Intel and AMD x86_64 hardware.
Tags : 
red hat, enterprise, linux 6 performance, memory, storage, database servers
    
Red Hat
Published By: Datastax     Published Date: Aug 15, 2018
In this eBook, we’ll take a look at why DataStax and Azure combine to make the ideal hybrid operational cloud database for the modern application needs of many of our large enterprise customers. The eBook explains how digital disruptors like Microsoft, Komatsu, and IHS Markit are leveraging the DSE+Azure hybrid cloud database to build game-changing applications built for the Right-Now Economy.
Tags : 
    
Datastax
Published By: Datastax     Published Date: Aug 27, 2018
Graph databases have the power to see deeply into real-time data relationships and make it easy to use relationship patterns for instant insight into large data sets. From IoT to networking to customer 360 to solving business problems with multi-model support, the power of graph can never be overstated. Read this white paper to learn the use cases for graph databases and how graph databases work.
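The relationship queries described above reduce to graph traversal. A minimal sketch, using a hypothetical customer-360 edge set as the example data: a breadth-first walk over an adjacency list finds every entity reachable within N hops of a starting node, which is the shape of query graph databases optimize.

```python
from collections import deque

# Hypothetical customer-360 graph: a customer links to orders and devices,
# orders link to products. Node names are illustrative.
GRAPH = {
    "alice": ["order-1", "device-7"],
    "order-1": ["product-9"],
    "device-7": [],
    "product-9": [],
}


def within_hops(graph, start, max_hops):
    """Return all nodes reachable from `start` in at most `max_hops` edges."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # do not expand past the hop limit
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen - {start}
```

A dedicated graph database executes this kind of neighborhood expansion over billions of edges with index-free adjacency, rather than the repeated self-joins a relational model would need.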
Tags : 
    
Datastax
Published By: Datastax     Published Date: Aug 28, 2018
In this eBook, we’ll take a look at why DataStax and Azure combine to make the ideal hybrid operational cloud database for the modern application needs of many of our large enterprise customers. The eBook explains how digital disruptors like Microsoft, Komatsu, and IHS Markit are leveraging the DSE+Azure hybrid cloud database to build game-changing applications built for the Right-Now Economy.
Tags : 
    
Datastax