Technical and Product News and Insights from Rackspace

MongoDB security tips

Getting started with MongoDB® is easy. However, new features emerge on an ongoing basis, and you can run into hiccups along the way. One such area of concern is security, which is the focus of this blog post.
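As a taste of the kind of hardening the post covers: access control is disabled in a stock MongoDB installation, so any client that can reach the port can read and write data. A minimal `mongod.conf` fragment that turns it on might look like the following (a sketch assuming the YAML config format used since MongoDB 3.x; create an admin user first, or you will only be able to connect via the localhost exception):

```yaml
# mongod.conf -- minimal security sketch.
security:
  # Require clients to authenticate and enforce role-based access control.
  authorization: enabled
net:
  # Bind only to localhost unless remote clients genuinely need access.
  bindIp: 127.0.0.1
```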

MongoDB tips: Part 1

Originally published by Tricore: Aug 2, 2017

While it’s easy to get started with MongoDB, more complex issues emerge when you’re building applications. You may find yourself wondering things like:

  • How do I re-sync a member of a replica set?
  • How can I recover MongoDB after a crash?
  • When should I use MongoDB’s GridFS specification to store and retrieve files?
  • How do I fix corrupted data?

This blog post shares a few tips for handling these situations when you’re using MongoDB.
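To preview the first question: the usual way to force an initial sync of a replica-set member is to stop it, move its data files aside, and restart it so that it copies a fresh data set from another member. A rough sketch follows; the data path and service name are placeholder assumptions, not details from the original post, so adjust both for your deployment.

```shell
# Sketch only: force an initial sync of one replica-set member.
# Assumes data path /var/lib/mongodb and a systemd service named "mongod".

# 1. Stop the member that needs to be re-synced.
sudo systemctl stop mongod

# 2. Move its data files aside (safer than deleting them outright).
sudo mv /var/lib/mongodb /var/lib/mongodb.bak
sudo mkdir /var/lib/mongodb
sudo chown mongodb:mongodb /var/lib/mongodb

# 3. Restart; the member rejoins the set and performs an initial sync.
sudo systemctl start mongod

# 4. Watch sync progress from the mongo shell on any member.
mongo --eval 'rs.status()'
```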

MongoDB tips: Part 2

Originally published by Tricore: Aug 24, 2017

In Part 1 of this series, we shared some tips for using MongoDB. In Part 2, we cover several more MongoDB topics, including optimization, performance, speed, indexing, schema design, and data safety.
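As one example of the indexing topics Part 2 touches on: a compound index can serve both the filter and the sort of a common query, and `explain()` confirms whether the planner actually uses it. The collection and field names below are hypothetical, not from the original post.

```javascript
// Run in the mongo shell; "orders", "customerId", and "orderDate"
// are illustrative names only.

// Compound index supporting queries that filter on customerId
// and sort by orderDate, newest first.
db.orders.createIndex({ customerId: 1, orderDate: -1 })

// Verify the index is used: look for an IXSCAN stage (rather than
// COLLSCAN) in the winning plan.
db.orders.find({ customerId: 42 })
         .sort({ orderDate: -1 })
         .explain("executionStats")
```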

Hadoop ecosystem basics: Part 1

Originally published by Tricore: July 10, 2017

Apache™ Hadoop® is an open source, Java-based framework that’s designed to process huge amounts of data in a distributed computing environment. Doug Cutting and Mike Cafarella developed Hadoop, which was released in 2005.

Hadoop runs on commodity hardware and is built on the basic assumption that hardware failures are common. Rather than relying on the hardware to be reliable, the framework is designed to detect and handle these failures itself.
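Concretely, HDFS copes with failing disks and nodes by storing each block on several DataNodes; the NameNode re-replicates any blocks that fall below the configured replication factor. That factor is an ordinary configuration setting, shown here with its common default of 3:

```xml
<!-- hdfs-site.xml fragment: block replication. Each HDFS block is
     stored on this many DataNodes, so losing a single node or disk
     does not lose data. -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```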

In Part 1 of this two-part blog series, we’ll cover big data, the Hadoop ecosystem, and some key components of the Hadoop framework.

Hadoop ecosystem basics: Part 2

Originally published by Tricore: July 11, 2017

In Part 1 of this two-part series on Apache™ Hadoop®, we introduced the Hadoop ecosystem and the Hadoop framework. In Part 2, we cover more core components of the Hadoop framework, including those for querying, external integration, data exchange, coordination, and management. We also introduce a module that monitors Hadoop clusters.
