Excellent presentation on Redis

Came across this creative presentation on Redis: http://www.slideshare.net/JustinCarmony/blazing-data-with-redis-20
Astyanax -> Cassandra PoolTimeoutException during Authentication failure?

Recently I was working on implementing a custom IAuthenticator and IAuthority for Cassandra 1.1.1, because there is little to no security out of the box. For those of you familiar with Cassandra, its distribution used to include a simple property-file based implementation of IAuthenticator and IAuthority that you could reference in your cassandra.yaml file […]
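For context, that property-file based setup was wired up by pointing cassandra.yaml at the sample authentication classes that shipped with older Cassandra releases. A minimal sketch of what that looked like (the class names are from the old SimpleAuthenticator example; verify them against your own 1.1.x distribution before relying on this):

```yaml
# cassandra.yaml (Cassandra 1.1.x era) -- sketch, assuming the
# SimpleAuthenticator example classes are on the classpath.
# Credentials and permissions then live in passwd.properties /
# access.properties files referenced via system properties.
authenticator: org.apache.cassandra.auth.SimpleAuthenticator
authority: org.apache.cassandra.auth.SimpleAuthority
```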
USPS AIS bulk data loading with Hadoop mapreduce

Today I pushed up some source to Github for a utility I was previously working on that loads data from USPS AIS data files into HBase/MySQL using Hadoop mapreduce as well as simpler data loaders. Source @ https://github.com/bitsofinfo/usps-ais-data-loader This project was originally started to create a framework for loading data files from the USPS AIS suite of […]
How to access your OpenShift MongoDB database remotely on OS-X

I recently started playing around with Redhat's OpenShift PaaS and installed the MongoDB and RockMongo cartridges on my application. My use case was just to leverage the OpenShift platform to run my MongoDB instance for me; I really wasn't ready (nor needing) to push an actual application out to OpenShift; […]
NoSQL databases – excellent research paper

Came across this excellent research paper; maybe a bit biased at times, however it really gives a great overview of NoSQL, the different implementations, and great comparisons with RDBMSs. Check it out at: http://www.christof-strauch.de/nosqldbs.pdf
Reading fixed length/width input records with Hadoop mapreduce

While working on a project where I needed to quickly import 50-100 million records, I ended up using Hadoop for the job. Unfortunately the input files I was dealing with contained fixed width/length records: there were no delimiters separating the records, nor any CR/LFs. Each record was exactly […]
HBase examples on OS-X and Maven

Ok, so today I needed to get HBase 0.20.0 running on my local OS-X box, simply in standalone mode. I am starting a project where I need to manage 50-100 million records and wanted to try out HBase. Here are the steps I took; the steps below consolidate some pointers found […]
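For standalone mode, the main thing to configure is where HBase writes its data, since it runs against the local filesystem with no HDFS or separate ZooKeeper. A minimal conf/hbase-site.xml sketch (the local path is a hypothetical example, not necessarily what the original steps used):

```xml
<!-- conf/hbase-site.xml : in standalone mode HBase writes straight to
     the local filesystem; the path below is a hypothetical example. -->
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///Users/yourname/hbase-data</value>
  </property>
</configuration>
```

With that in place, bin/start-hbase.sh brings the instance up and bin/hbase shell gives an interactive prompt to verify it.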