
Hive Example

Use Case:

A supermarket records its sales in a file. Whenever an item is sold, the item name, the number of units sold, and the cost per unit are written to the file in comma-separated format.

A sample file would look like this:

Apple,10,10
Mango,20,5
Guava,10,3
Banana,30,4
Apple,10,5

At the end of the day we need to find the total sales for each item, i.e. the sum of units * unit cost over all of that item's rows (for example, Apple is sold twice, 10 units at 10 and 10 units at 5, so its total is 10*10 + 10*5 = 150).

Expected Output:

Apple 150
Mango 100
Guava 30
Banana 120

Implementing in Hive

Getting started with Hive:

Open a terminal and type hive; this will open the Hive shell.
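
For example, on the Cloudera QuickStart VM used later in this post (an assumption; any machine with Hive installed works the same way):

# launch the Hive CLI from a terminal
hive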

Create and use the sales database:

Create the database:

create database salesdb;

Use the database:

use salesdb;
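
To confirm that the database exists, you can optionally list the databases; salesdb should appear in the output:

show databases;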

Create the sales table:

CREATE TABLE ITEM_SALES_RECORD (
  ITEM_NAME string,
  UNITS int,
  UNIT_COST decimal)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ","
LINES TERMINATED BY "\n";

NOTE: Table names and column names are not case sensitive.
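
You can optionally verify the schema with DESCRIBE; the lowercase name works because of the case insensitivity noted above:

describe item_sales_record;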

Insert data into the table from a file:

Use the Java file below to generate the data:

https://github.com/nachiketagudi/Map-Reduce/blob/master/GenerateData.java

An input file will be generated at /home/cloudera/input.txt.
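
If you prefer not to run the Java generator, you can also create a small test file by hand; a minimal sketch using the sample rows from above and the same path:

# write the five sample rows to the input file
cat > /home/cloudera/input.txt << 'EOF'
Apple,10,10
Mango,20,5
Guava,10,3
Banana,30,4
Apple,10,5
EOF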

We can load the data into the table in two ways: by keeping the file in the local file system, or by pushing it to HDFS.

Loading from the local file system:

load data local inpath '/home/cloudera/input.txt' into table item_sales_record;
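
After loading, a quick optional check that the rows arrived:

select * from item_sales_record limit 5;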

Loading from HDFS:

Push the file to HDFS: hadoop fs -put input.txt

With no destination given, -put places input.txt in your HDFS home directory (on the Cloudera QuickStart VM this is /user/cloudera), so load it from that path:

load data inpath '/user/cloudera/input.txt' into table item_sales_record;
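
Unlike the LOCAL variant, this statement moves the file within HDFS into the table's warehouse directory rather than copying it. If you want to double-check where -put placed the file before loading:

hadoop fs -ls /user/cloudera/input.txt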

Obtaining the output:

Use the query below, just like a regular SQL query, to obtain the desired output from the table:

SELECT ITEM_NAME,SUM(UNITS*UNIT_COST) FROM ITEM_SALES_RECORD GROUP BY ITEM_NAME;

This will launch MapReduce jobs and produce the output shown above.
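
If you want to keep the result rather than just print it, one option (a sketch; the table name is arbitrary) is to store it in a new table with CREATE TABLE ... AS SELECT:

CREATE TABLE ITEM_SALES_TOTALS AS
SELECT ITEM_NAME, SUM(UNITS*UNIT_COST) AS TOTAL_SALES
FROM ITEM_SALES_RECORD
GROUP BY ITEM_NAME;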




