Amazon EC2


  • How does deduplication in cloud computing work and is it beneficial?

The deduplication process reduces the amount of data in a storage system, but dedupe in the cloud may be more valuable to the cloud provider than to the customer.

  • Staring down the security issues in cloud computing

    Adoption of cloud computing has no doubt been slowed by worries about the security of those out-of-sight servers and resources. While reasonable, those worries have given way over time to acceptance and even optimism.

    In this month's Modern Infrastructure cover story, TechTarget’s Trevor Jones writes about why security issues in cloud computing are not the impediment to adoption that they once were. In fact, some organizations are coming to the conclusion that their workloads run more securely in a public cloud than in an on-premises environment. Cloud service providers possess security expertise and experience at levels that aren’t as readily available on a typical IT staff, and certifications give confidence that providers can actually do all they claim to do.

    The public cloud is not a risky place to do business, though its safeguards are different from those you've put in place to protect your own data center. These differences are most clearly seen when looking at the shared-responsibility model, which addresses many of the security issues in cloud computing. Users and providers must each do their part. Otherwise, the risks will become apparent, and cloud computing won't be what you need it to be.

    This issue also looks at how a new wave of products in development brings memory and storage technologies closer together. Nonvolatile dual-inline memory modules, for example, combine the speed of memory with the persistent qualities of storage in some interesting ways. Also included is an article on how some IT shops that have adopted flash storage are simultaneously impressed and disappointed with the results. Costly flash products invariably improve performance, but they won't solve every problem or clear every bottleneck.


  • Internet of things data security proves vital in digitized world

Securing IoT data should become a priority as more companies mine the volumes of data these devices produce. Even seemingly innocuous information could enable privacy invasions.

  • Metadata management tools help data lake users stay on course

Effective metadata management processes can prevent analytics teams working in data lakes from creating inconsistencies that skew the results of big data analytics applications.

  • BI self-service needs new thinking to truly serve business users

To be truly useful to a broad set of business users, self-service BI tools need to become easier to use. An increased focus on metadata and artificial intelligence can help with that.

  • New options to evolve your data backup and recovery plan

The server backup market first evolved to protect VMs, but now it's undergoing another transformation. Find out how it has evolved and what you need to know for the future.

  • Smart storage systems smart for business

Mike Matchett explains how data-aware storage combined with application awareness is leading to a new wave of intelligent data storage.

  • Metadata injection marks Pentaho big data pipeline

The crush of big data leads some data pros to seek more automation of data integration processes. The Pentaho software platform now offers metadata injection capabilities to help meet such needs.

  • Push your digital asset management strategy beyond metadata

Digital asset management needs to go beyond metadata to build and maintain branded digital experiences that include mobile and rich media. An expert offers tips on how to begin.

  • Why data lake governance is key to modern data architecture

Data lakes, or storage repositories that can hold vast quantities of raw data and be queried for analysis, have offered businesses a more flexible alternative to traditional data warehouses. But as some companies that have implemented them have found, these repositories can fail to deliver without data lake governance: businesses may produce no meaningful business intelligence, or they could even put the business at risk.

    In this issue of CIO Decisions, Senior News Writer Nicole Laskowski lays out how to walk the fine line between data lake governance and taking advantage of the flexibility of data lakes. Plus, we unpack how a tools and knives manufacturer overcame data center performance issues when upgrading its ERP system; explore how Dow Jones' chief innovation officer approaches the business innovation process; and examine whether advances in robotics will enable robots to help humans innovate.


  • A smart organization's guide to VM sprawl management

VM sprawl plagues the virtualized environments of many organizations. The key to overcoming it is incorporating automation and orchestration in your sprawl management plan.

  • Six reasons IoT storage should be object-based

Internet of Things data requires a storage system that can scale, as well as support analytics projects. Find out how object storage systems meet those needs.

  • Data lake governance: A big data do or die

Data lake governance may not be a sexy undertaking, but it has become a critical component of modern data architecture, according to experts.

  • Contextually aware content amps up the online experience

Contextually aware content can enrich online experiences by anticipating user needs, but it also demands a firm grasp of the customer journey and the information architecture already in place.

  • Data awareness gives storage pros new insights

Early adopters of data-aware storage arrays see better management and analytics, as well as compliance assurance, from the technology.