

Amitav Swain - Big Data Architect / Data Scientist

SUMMARY: Fourteen years of IT experience, including ten years as a Data Architect designing and developing data platforms and modeling data warehouses across the Big Data ecosystem: Hadoop (both HDP and CDH), Cassandra (Apache and DataStax Enterprise flavors), HBase, Hive, Pig, Sqoop, Oozie, MapReduce, and Flume.

Expertise in evaluating and designing system interfaces using ORM frameworks such as MyBatis and Hibernate.
Extensive report development experience on tools such as Jasper Reports, OBIEE (11g), and Tableau.
Cloud expertise mostly on AWS, from deploying spot EMR for HPC workloads to setting up plain Hadoop clusters for exploratory big-data systems on EC2 instances.
Experience installing monitoring systems such as Ambari on AWS for cluster health monitoring and cluster management.
Broad implementation knowledge of AWS services such as EC2, EMR, SNS, ELB, and network components (VPC, subnets, gateways, etc.).
Experience designing and developing real-time big-data ingestion architectures using Storm, Kafka, Cassandra/HBase, and Spark Streaming.
Undertook senior coordination roles managing large teams and end-user bases for development, production support, and requirements gathering across multiple projects.
Extensive experience in data warehouse design, migration, data pipeline development, ETL design (Informatica/Pentaho), data analytics platforms, and data validation frameworks.
Built a POC using Apache Spark and Scala on the service layer of the Trellis analytics platform using a Lambda architecture.
Experience building and implementing Lambda architectures for data ingestion pipelines.
Hands-on experience with complex RDBMS SQL and Hive HQL.
Experience managing and reviewing Hadoop log files; able to analyze and performance-tune a Hadoop cluster.
Expertise in OLTP/OLAP/MOLAP data modeling.
Worked as a Scrum Master within the Scrum development life cycle.
Expertise in Hive HQL and Oracle SQL/PLSQL performance tuning.
Hands-on development experience with RDBMS, including writing complex SQL queries, stored procedures, and triggers.
Helped the support team write complex Impala queries on the production environment.
Helped business analysts generate analytical reports (POC) developed in Tableau.
Hands-on data parsing using Avro and Parquet formats.
Recognized multiple times for the ability to manage people and projects toward positive outcomes.
Contributed as a Lead Architect at Wipro Technology; specialized in applications, data migrations, and integrations.
Strong problem-solving, troubleshooting, and analytical skills.
Other skills include client relations, business development, technical/functional analysis, and team management and coordination; speaker at technology conferences, seminars, and workshops.

TECHNICAL SKILLS:
NoSQL Databases: MongoDB, HBase, Cassandra
Big data frameworks: Apache YARN, CDH4 Hadoop, HDP 2.2 (HDFS/MapReduce), RHadoop framework
Databases (RDBMS): Oracle 10g/11g, MySQL
Scripting languages: Hive 0.14 (HQL), Pig (CDH4 Hadoop), CDH4 Impala, Unix shell, Python (NumPy)
Operating systems: UNIX and MS-DOS 6.22
ETL tools: Informatica 8 PC, Composite 4.6, Pentaho Data Integration (PDI) 4.2 open source
Software tools: SQL, Oracle PL/SQL, SQL*Loader, Unix shell script, C, C++, MS Visual Basic 6.0, COBOL, Developer 2000 (Forms 6.0, Reports 6.0), Crystal Reports 8, Toad
Analytics/reporting tools: R, SSAS, MDX queries, OBIEE (10g/11g), Jasper iReport 4.7/5.1/5.2, Datameer 2.1, Tableau, SSRS
Programming languages: Core Java, Python, R
Methods: Oracle AIM (Application Implementation Methodology)
Build tools: Maven 3.1
Cloud configuration: Big data setup on AWS environment
Web development tools: HTML, XML (XSL and CSS)
ERP application: Oracle 11i Applications (11.5.0) Lease module
Application server: Oracle 9i Application Server
Testing tool: Quick Test Pro (QTP)
Data modeling tools: Erwin, Open Spear


Domain Expertise:
Financial Domain: Key areas of interest are investment banking, futures and options, cash flow, equity, general ledger, etc.
Data Centre Management: Trellis data centre automation (inventory management and site management).
SCM (Supply Chain Management): Forecasting on order planning, factory planner, inventory, Master Sales Plan (MSP), Master Requirement Plan (MRP), Dell procurement process flow, Original Design Manufacturer (ODM) automation, etc.

Latest Publications:
Feature Selection and Classification of Microarray Data using MapReduce based ANOVA and K-Nearest Neighbor
http://www.sciencedirect.com/science/article/pii/S1877050915013599

EXPERIENCE:

Big Data Architect / Data Scientist
Docmation LLC, Feb 2016 to Present
Project: UNITED HEALTH GROUP
Provided UHG with a data lake platform that acts as data-as-a-service for end consumers, plus a predictive modeling platform for ad hoc, accurate data analysis that enhances the business process.
Job Description/Duties:
Provided the big data POC by setting up an 11-node HDFS cluster on EC2 instances.
Designed the job that Sqooped/loaded data from Oracle (different drug vendor information) into the HDFS cluster.
Provided a POC to load data into the Cassandra layer by integrating with Apache Spark.
Designed the Storm parser code that pulls data from Kafka and loads it into Cassandra (a minimal sketch of the same pull-and-persist step follows this section).
Proposed, whiteboarded, and ran brainstorming sessions on solution options, including but not limited to SAP HANA Vora 1.2, Apache Spark 1.6.1, Hadoop 2.7, HBase 1.1, Cassandra 3.x, AWS/EMR, Kafka 0.9, Oracle 12c, and Talend 5.6 ETL.
Using the proposed options, ran through critical client use cases including but not limited to live alerts, real-time tracking and reporting, predictive analysis based on real-time and historical data, and ad hoc/batch reporting.
Performed analysis on the current database design, data fabric, architecture, and data flow, and proposed the changes needed in the respective areas to improve overall efficiency, scalability, and future growth.
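The Kafka-to-Cassandra step above was implemented as a Storm topology; below is a hedged, minimal Python sketch of the same pull-from-Kafka, write-to-Cassandra idea. The topic, keyspace, table, and host names are hypothetical placeholders, not the project's actual identifiers.

```python
# Illustrative sketch only: the actual implementation used a Storm parser;
# this shows the equivalent consume-and-persist step with Python clients.
import json

from kafka import KafkaConsumer            # pip install kafka-python
from cassandra.cluster import Cluster      # pip install cassandra-driver

consumer = KafkaConsumer(
    "drug-vendor-events",                  # hypothetical topic name
    bootstrap_servers=["kafka01:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

session = Cluster(["cassandra01"]).connect("healthcare")   # hypothetical keyspace
insert = session.prepare(
    "INSERT INTO vendor_events (vendor_id, event_ts, payload) VALUES (?, ?, ?)"
)

for record in consumer:
    event = record.value
    # Parse each Kafka message and persist it to the Cassandra serving layer.
    session.execute(insert, (event["vendor_id"], event["event_ts"], json.dumps(event)))
```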

Big Data Developer / Architect / Data Scientist
Client: T-Mobile, Wipro Technology, Sep 2015 to Feb 2016
T-Mobile developed a single repository as a data lake that ingests different service-provider data from all the routers. It helps identify service usage across different customers and the frequency of call drops in a particular region, so that bandwidth can be enhanced across regions in the USA and the academic and corporate sector share can be captured.
Responsibilities:
Provided the big data POC by setting up a 36-node HDFS cluster.
Involved in the data model design process (OLTP/OLAP).
Designed the job that Sqooped/loaded data from Netezza (different vendors' DB nodes) to the HDFS (HDP 2.2) cluster.
Mentored developers on Oracle optimization (stored procedures/packages/triggers/transformations using Pentaho and the Java API).
Designed the HBase loader to load data into Hive.
Tuned the pipeline load process from the legacy systems (Oracle/log files/HDFS) to the Cassandra database.
Designed the data extraction layer on Hive.
Designed the Storm parser code that pulls data from Kafka and loads it into the Hive staging layer and HBase, queried using Spark SQL (a sketch of such a query follows the cluster specs below).
Performed SQL query tuning.
Involved in writing complex OLAP SQL queries on the Oracle DB.
Performed code reviews and unit testing.
Cluster Specs:
Data Size: 27 TB
Cluster Architecture: Fully distributed
Package Used: Hortonworks
Cluster Capacity: 40 TB approx.


No. of Nodes: 8 Data Nodes + 1 Master + 1 NFS Backup for NN + 1 Edge Node
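As a hedged illustration of the Spark SQL querying mentioned above, here is a minimal PySpark sketch of a call-drop-frequency query against the Hive staging layer. It uses the modern SparkSession API rather than the Spark 1.x HiveContext of the project era, and the table and column names are hypothetical.

```python
# Minimal PySpark sketch of a Spark SQL query over the Hive staging layer.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("call-drop-analysis")
         .enableHiveSupport()
         .getOrCreate())

# Frequency of call drops per region, used to decide where bandwidth should be added.
call_drops = spark.sql("""
    SELECT region, COUNT(*) AS drop_count
    FROM stage.router_events          -- hypothetical Hive staging table
    WHERE event_type = 'CALL_DROP'
    GROUP BY region
    ORDER BY drop_count DESC
""")
call_drops.show()
```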

Big Data Architect
Wipro Technology, Apr 2015 to Sep 2015, India / US
Data ingestion and automation platform for Zurich bank. The main goal was to capture all the features from different sources at different product levels and optimize the business process across different regions in the USA.
Responsibilities:
Designed the job that Sqooped/loaded data from Oracle (different vendors' DB nodes) into the HDFS cluster.
Provided complex, optimized SQL queries for the reporting framework.
Helped data scientists identify features during the implementation of logistic regression.
Provided custom UDFs using Python scripts (MapReduce streaming).
Involved in tasking/estimating the respective stories defined in each sprint.
Created and ran Sqoop jobs with bulk and incremental loads to populate Hive tables; pushed data using Sqoop from Hive tables to the reporting database in Oracle.
Extensive architectural and hands-on experience in Apache Spark 1.1-1.6 and Hive for data transformation and querying.
Developed UDFs using both DataFrames/SQL and RDD/MapReduce in Spark 1.3+ for data aggregation and queries, writing data back to the OLTP system directly or through Sqoop (a sketch of this style of UDF follows the cluster specs below).
Worked extensively with different teams and wrote queries, scripts, and transformations to flatten source data and convert it into functional modules; some of the data was derived from the existing data warehouse in Oracle 11g.
Worked extensively on troubleshooting and debugging Hadoop errors and ingestion failures, and built robust, fault-tolerant scripts and architecture to prevent future failures and exceptions.
Cluster Specs:
Data Size: 27 TB
Cluster Architecture: Fully distributed
Package Used: Hortonworks (HDP 2.2)
Cluster Capacity: 40 TB approx.
No. of Nodes: 30 Data Nodes + 2 Masters + 2 NFS Backup for NN + 2 Edge Nodes
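The DataFrame-style UDF and aggregation work above can be pictured with the hedged PySpark sketch below. It uses the current PySpark API rather than the Spark 1.3-era syntax, and the bucketing rule, Hive table, and Oracle JDBC target are hypothetical stand-ins.

```python
# Hedged sketch of a DataFrame UDF used for aggregation and write-back to Oracle.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf, sum as sum_
from pyspark.sql.types import StringType

spark = (SparkSession.builder
         .appName("exposure-aggregation")
         .enableHiveSupport()
         .getOrCreate())

@udf(returnType=StringType())
def risk_bucket(exposure):
    # Classify exposure amounts into coarse reporting buckets (hypothetical rule).
    if exposure is None:
        return "UNKNOWN"
    return "HIGH" if exposure > 1_000_000 else "LOW"

trades = spark.table("stage.trades")                      # hypothetical Hive table
summary = (trades
           .withColumn("bucket", risk_bucket("exposure"))
           .groupBy("product", "bucket")
           .agg(sum_("exposure").alias("total_exposure")))

# Write the aggregate back to the Oracle reporting schema over JDBC
# (the same data could also be pushed via Sqoop, as described above).
summary.write.jdbc(
    url="jdbc:oracle:thin:@//oracledb:1521/REPORTING",     # hypothetical connection
    table="RPT_EXPOSURE_SUMMARY",
    mode="overwrite",
    properties={"user": "rpt_user", "password": "***",
                "driver": "oracle.jdbc.OracleDriver"},
)
```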

Big Data Architect
Altisource, India, Mar 2014 to Mar 2015
Automation and generation of cash reports/claim reports for the end customer (FNMA), and a platform for business users for predictive analysis.
Responsibilities:
Provided the best design approach and added the new module for generating cash reports/claim reports for the end customer (FNMA).
Provided the big data POC by setting up a 40-node HDFS cluster.
Provided complex SQL queries for the ETL and reporting framework.
Involved in the data model design process (OLTP/OLAP).
Designed the job that Sqooped/loaded data from Oracle (different vendors' DB nodes) into the HDFS cluster.
Involved in tasking/estimating the respective stories defined in each sprint.
Enhanced the predictive data model by successfully integrating Hadoop and R, helping to flag potential loan delinquency early in the mortgage industry (Kmn algorithm; a Python stand-in sketch of this idea follows below).
Integrated reporting (Jasper) with the Hadoop platform.
Cluster Specs:
Data Size: 8 TB
Cluster Architecture: Fully distributed
Package Used: CDH4
Cluster Capacity: 39.5 TB ~ 40 TB approx.
No. of Nodes: 8 Data Nodes + 2 Masters + NFS Backup for NN
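The Hadoop + R predictive model described above can be pictured with the hedged Python/PySpark stand-in below (the original work used R on Hadoop; "Kmn" is read here as k-means clustering). The Hive table and feature columns are hypothetical.

```python
# Illustrative Python stand-in for the Hadoop + R predictive model described above.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = (SparkSession.builder
         .appName("delinquency-poc")
         .enableHiveSupport()
         .getOrCreate())

loans = spark.table("lake.loan_history")        # hypothetical Hive table of loan features

features = VectorAssembler(
    inputCols=["missed_payments", "ltv_ratio", "months_on_book"],  # hypothetical features
    outputCol="features",
).transform(loans)

# Cluster loans and treat membership in high-risk clusters as an early delinquency signal.
model = KMeans(k=4, seed=42, featuresCol="features").fit(features)
scored = model.transform(features)              # adds a 'prediction' cluster column
scored.groupBy("prediction").count().show()
```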


Data Architect
Emerson Network Power, India, Feb 2013 to Mar 2014
Project: Trellis Data Center Analytics (Release 2.2)
Trellis (a data center solution) is a cloud-based IoT environment. It helps mature Trellis at the organizational level by providing a cloud-based data center framework for small/midsize vendors and managing their data centers remotely.
Responsibilities:
As a data architect, provided the best approaches to management for migrating Trellis (the data center solution) to the cloud environment, providing a cloud environment for small/midsize vendors and remote management of their data centers.
Gathered requirements from the product owner of the data center.
Provided the big data POC by setting up a single-node HDFS cluster.
Designed/coded the job that Sqooped/loaded data from Oracle (different vendors' DB nodes) into the HDFS cluster.
Performed SQL query tuning and wrote complex OLAP SQL queries on the Oracle DB.
Involved in the data model design process (OLTP/OLAP).
Captured massaged data in the MongoDB database layer for some of the audit reports across all data centers.
Generated the audit report based on the different features added to the respective devices, using the MongoDB document database (a sketch of this style of query follows below).
Tuned the pipeline load process from the legacy systems (Oracle/log files/HDFS) to the Cassandra database.
Designed/coded MapReduce jobs (using Pentaho ETL tools and the Java API).
Wrote scripts for extraction and ingestion of drilling sensor data from the Oracle database.
Built scripts (stored procedures) in the Oracle database.
Created Oozie 3.3 jobs to pull flat data into Hive tables.
Created Hive queries based on business requirements and pushed the summarized output to flat files.
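The MongoDB-backed audit reporting mentioned above can be pictured with the minimal pymongo sketch below; the database, collection, and field names are hypothetical.

```python
# Minimal pymongo sketch of an audit-report style aggregation over device changes.
from pymongo import MongoClient

client = MongoClient("mongodb://mongo01:27017")
audit = client["trellis"]["device_audit"]       # hypothetical db / collection

# Count feature additions per device type and data centre for the audit report.
pipeline = [
    {"$match": {"action": "FEATURE_ADDED"}},
    {"$group": {
        "_id": {"datacenter": "$datacenter", "device_type": "$device_type"},
        "changes": {"$sum": 1},
    }},
    {"$sort": {"changes": -1}},
]
for row in audit.aggregate(pipeline):
    print(row["_id"], row["changes"])
```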

Senior Data Architect
Emerson Network Power, Aug 2012 to Feb 2013
Project: Trellis Data Center Analytics (Release 2.2)
Responsibilities:
As a senior data architect, provided the OLAP solution for the Trellis product, which helps optimize data center management.
Mentored developers on Oracle optimization (stored procedures/packages/triggers/transformations using Pentaho and the Java API).
Mentored developers on SQL query tuning and application tuning.
Integrated Jasper Reports with Oracle/HDFS to generate audit-level reporting; this also reduced complex transformations from the log files in the reporting database (Oracle RDBMS).
Involved in designing the OLAP data model.
Developed a Pentaho ETL layer (with the Java API) as a map-reducer to process huge log files and load them into HDFS as a daily batch job.
Designed and coded reports using the Jasper tool and enabled various features (ad hoc reporting using Topics/Domains).
Enabled the Jasper caching mechanism in the Trellis product, improving report turnaround time.
In the initial stage of the POC on Trellis, loaded real-time data from the different sensors into the Cassandra database using Kafka and Storm, storing the data at different time-series versions.
Matched the data flow from the RDBMS (via Sqoop) with the real-time sensor/log data flow and loaded both into the common Cassandra repository.
Defined the Cassandra keyspace and column families based on the domain specifics (a sketch of such a time-series table definition follows below).
Responsible for building scalable distributed data solutions using DataStax Cassandra.
Tuned the pipeline load process from the legacy systems (Oracle/log files/HDFS) to Cassandra.
Environment: Oracle SQL/PLSQL (Oracle 11g), HDFS (CDH4), Erwin, Pentaho PDI ETL tool, Java, Jasper OLAP server, Jasper Reports, Quick Test Pro.
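The keyspace and column-family design work above can be pictured with the hedged sketch below, issued through the Python driver. The keyspace, table, columns, and replication settings are hypothetical examples of a time-series layout, not the project's actual schema.

```python
# Hedged sketch of a domain-specific keyspace and time-series table definition.
from cassandra.cluster import Cluster

session = Cluster(["cassandra01"]).connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS trellis
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")

# Partition by sensor and day, cluster by reading time, so each partition stays
# bounded and range scans over a time window are cheap.
session.execute("""
    CREATE TABLE IF NOT EXISTS trellis.sensor_readings (
        sensor_id   text,
        reading_day date,
        reading_ts  timestamp,
        metric      text,
        value       double,
        PRIMARY KEY ((sensor_id, reading_day), reading_ts)
    ) WITH CLUSTERING ORDER BY (reading_ts DESC)
""")
```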

Data Architect


NetApp, India, Mar 2011 to Jun 2012
Responsibilities:
As a data architect (MTS-3 role), provided the OLAP solution for the Burt matrix reporting system. Burt is an analytical reporting solution that pulls data from the OLTP Burt system (enhancements/bugs raised by customers) and tracks implementation/resolution accuracy. It is a weekly trending reporting system through which top-level management can define the timeline for adding new features to the different releases of the ONTAP product.
Gathered requirements from the Burt users (ONTAP release managers).
Provided the POC and, based on it, defined the tool that would satisfy the user requirements.
Mentored developers on Oracle optimization (stored procedures/packages/triggers).
Mentored developers on SQL query tuning and application tuning.
Developed/customized Pentaho ETL using the Java API.
Designed the physical OLAP data model.
Provided the coding methodology to the development team.
Followed up with the deployment team.
User training and implementation.
Environment: Oracle SQL/PLSQL (Oracle 10g), Erwin, Pentaho PDI ETL tool (Java), OBIEE (10g), Quick Test Pro.

Senior Database Architect
NetApp, India, Dec 2010 to Mar 2011
Responsibilities:
As a senior database architect, worked on the NetApp quality dashboard for end users, so that any new feature added in the future can follow a proper release process; integrated the quality process with RPM (NetApp Resource Planning Management) so that resource utilization can be properly optimized.
As a Scrum Master, prepared project estimations, i.e., sprint estimation.
Performed data modeling for the NetApp RPM database.
Prepared the technical design document.
Mentored developers on Oracle optimization (stored procedures/packages/triggers).
Mentored developers on SQL query tuning and application tuning.
Optimized the OLAP model (optimization of the .RPD and physical/BMM layers).
Mentored developers on coding and testing for PL/SQL and Informatica workflows.
Prepared test cases.
Involved in performance testing for data loading into the ROLAP model.
Performed code reviews.
End-to-end data modeling for the RPM and quality processes.
User training and implementation.
Environment: Oracle PL/SQL (Oracle 10g), Erwin, Informatica PC, OBIEE (10g), Quick Test Pro.

Database Architect
Dell, India, Nov 2008 to Dec 2010
Responsibilities:
Worked for Dell procurement services; hands-on involvement in the supply chain business flow.
Involved in the analytics engine that produced the MIS spending report, taking MRP (Material Requirement Plan) and material cost as measures.
The engine let users obtain forecasted detail for a PART/SKU/MODE across region/ODM (Original Design Manufacturer) based on the past one year, preparing a rolling estimate for the next six months; through this, users (Michael Dell / Robin Johnson) controlled the entire manufacturing/supply chain system (a toy sketch of this forecasting idea follows this section).
Followed agile methodology.
Worked as a Scrum Master, preparing project estimations, i.e., sprint estimation.
Performed data modeling for the Dell procurement database.
Prepared the technical design document.
Coding and testing for PL/SQL, SSAS, and Informatica workflows.
Developed JavaScript to automate the build process.
Prepared test cases.
Involved in performance testing for data loading into the SSAS cube.


Performed code reviews.
Involved in planning and carrying out projects in software development, integration, and process improvements around the MRP (Material Requirement Plan) and MSP (Master Sales Plan).
End-to-end data modeling for the ODM (Original Design Manufacturer) PO automation and reporting solution.
User training and implementation.
Environment: Oracle PL/SQL (Oracle 10g), Erwin, JavaScript, Composite 4.6, Informatica PC, SSAS, SSRS, Quick Test Pro.
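The forecasting idea described above (one year of history, rolling six-month estimate) can be pictured with the toy pandas sketch below. The real engine was built with Informatica/SSAS, and the demand figures are made-up sample data, not project results.

```python
# Toy pandas sketch of a rolling six-month forecast from twelve months of history.
import pandas as pd

history = pd.DataFrame({
    "month": pd.period_range("2009-07", periods=12, freq="M"),
    "mrp_qty": [120, 130, 125, 140, 150, 145, 160, 155, 170, 165, 175, 180],  # sample data
})

# Use the trailing three-month average demand as the per-month estimate,
# then project it forward over the next six months.
per_month = history["mrp_qty"].tail(3).mean()
forecast = pd.DataFrame({
    "month": pd.period_range(history["month"].iloc[-1] + 1, periods=6, freq="M"),
    "forecast_qty": [round(per_month)] * 6,
})
print(forecast)
```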

Data Architect
Deutsche Bank Insurance, UK (HCL Tech), Aug 2007 to Nov 2008
Responsibilities:
Involved in the design, development, testing, and integration of the DARE database. DARE processes all daily and monthly reporting requirements for Business Area Controlling (Germany), CIB S&T. To ensure clean data, Debt Securities Accounting Services (DSAS) use DARE as their main automated reconciliation platform on a daily basis. Another main focus of DARE is data pooling and historical and performance reporting.
Worked on a consistent data model for all products covered in BAC GM for the centralized data pool.
Interfaces to all necessary source systems, which provide positions, transactions, and market data.
Automated quality checks of input data.
Standardized management reports via Oracle Reports, HTML output with export functionality to XLS/CSV/PDF, and Oracle Discoverer reports.
Batch extract functionality for data as flat files.
Automated batch processing controlled by end users, with selectable frequency at any point in the future, auto-repeat and e-mail notification, and log status with detailed messages.
Coding and testing using PL/SQL.
Environment: Oracle MOD PL/SQL, Toad, Oracle 9i/10g, JavaScript, HTML.

Senior Oracle Developer / Administrator
Client: Mansion Betting System, UK (Tech Mahindra), Jun 2006 to Jul 2007
Responsibilities:
Involved in the design and migration of the Mansion Betting System, an alternative front end to the online web application for the Sports Book and Exchange. The purpose of this approach was to reduce bandwidth utilization and provide the future ability to add player-focused functionality and push content to Mansion players.
SQL query tuning.
Requirements gathering and data modeling for the sports book and betting server modules using the Together architecture tool.
Migrated data using the SQL*Loader utility.
Optimized Oracle objects (stored procedures/packages/triggers).
BMM/physical layer optimization during (ROLAP model) repository creation.
Coding and testing using PL/SQL.
Performed query tuning for the Mansion sports book production database using explain plans.
Created Unix shell scripts leveraging the Oracle 10g advanced Flashback feature in the Mansion test database.
Environment: Oracle PL/SQL, OBIEE (10g), Together architecture data modeling tool, MS Windows XP 2002, and HP UNIX.

Oracle Developer / Administrator
Client: Nepal Bank Insurance Limited (NBL), 3i-Infotech, Jan 2005 to Dec 2005
Responsibilities:
Involved in the Nepal Bank Limited (NBL) migration project. My responsibilities included:
Migrated and validated data using external tables, the Informatica (PC 6.2) tool, and transformation techniques.
Took online database backups and recovered from them.
Used the LogMiner technique to retrieve data from archive log files during the archive process.
Tuned queries for the NBLMIG production database using explain plans.
Environment: PL/SQL, UNIX, Informatica 6.2 PC, MS Windows NT, and HP UNIX.

Oracle Developer
Key Equipment Finance (KEF), Ireland, UK


Oracle Corporation, Aug 2004 to Dec 2004

Involved in the development of Key Equipment Finance (KEF).
Customized CUSTOM.PLL with ZOOM functionality.
Developed MD70 design documentation.
Designed custom tables to hold contract-level and line-level additional information.
Generated the test and installation files using Unix shell scripts.
Generated reports (AR invoice) using the Oracle 6i Report tool in the Oracle 11i Applications environment.
Generated an Excel interface by writing a macro for filtering data from staging tables to specific business tables.
Environment: SQL*Loader, PL/SQL, Forms and Reports (6i), UNIX, Excel interface (macro), MS Windows NT, and HP UNIX.

Oracle Developer / Administrator
Project: City of Memphis, Oracle Corporation, US, Apr 2002 to Jul 2004
Responsibilities:
Involved in the migration of data into an Oracle database for analyzing historical information for the different asset management processes.
Prepared the technical design document (MD70) using the Oracle AIM standard.
Developed the interface using PL/SQL packages.
Migrated the data and was involved in the data loading process.
Developed the installation file on the UNIX box.
Created LDTs for the concurrent programs in the first stage of implementation.
Environment: Oracle 11i Applications (11.5.0) / 10g database, PL/SQL, Developer 2000 (Forms and Reports 6i), SQL*Loader, UNIX, and Excel sheets with macros.

Oracle Developer
Module 2: USGYPSUM
Responsibilities:
As an Oracle Developer, involved in the development and testing of USGYPSUM. My responsibilities included:
Understood the flow of the AP module.
Wrote test cases using the Oracle AIM standard document TE20.
Generated the QTP scripts and made them dynamic by adding different levels of validation for various modules.
Environment: PL/SQL scripts, Oracle 11i Applications, MS Excel sheet with macros, QTP 6.8, UNIX, MS Windows NT, and HP UNIX.

EDUCATION:
Masters in Computer Applications, Sambalpur University, India, 2002
Bachelor of Science, Sambalpur University, India, 1999
Diploma in Computer Applications, NIIT, 1998-99
PhD enrollment in progress, on distributed computing.

LATEST PUBLICATIONS:
Feature Selection and Classification of Microarray Data using MapReduce based ANOVA and K-Nearest Neighbor
http://www.sciencedirect.com/science/article/pii/S1877050915013599

AWARDS AND ACHIEVEMENTS:
Best Employee Award for successful execution of the reporting framework in the Trellis Data Center Analytics environment.
Best Individual Contributor (Domain/Technology) for the successful implementation of Plato in the Dell procurement space.
Best Performance Award for the successful implementation of the finance project "City of Memphis" from the Oracle NAIO division.
Presented and published a paper on 'Grid Computing on Replica Management' at NIT (formerly REC, Rourkela), Rourkela (national-level student convention 'INFOFIESTA', Rourkela CSI chapter).
Participated in the national-level seminar 'Manage IT' at RIMS (Rourkela Institute of Management Studies), Rourkela, India.
CSI student membership.