Led the efforts to design and build a data warehouse on the FACETS OLTP system. Assisted with architecture and planning for SQL Server upgrades, maintenance, replication, and disaster recovery. Prepared ETL design specification documents covering the implementation of business logic and the job flow. Worked with DBAs to assign space and privileges for departmental needs. Tools used: Oracle 8.x, UNIX, Toad, SQL*Plus, Designer 2000, EDA/SQL Copy Manager. Used DataStage Manager to import metadata from the repository, create new job categories, and import table definitions from database tables. Developed an ETL process integrating incremental data loads into the data warehouse using the organization's Pear function library. Deployed the project on Amazon EMR with S3 connectivity for backup storage. Managed analysis, design, coding, and testing of ETL jobs for 7 source systems. Installed and configured ODI from seven source systems to an Oracle 11g target. Executed them with Oracle's SQL*Plus and Teradata's Queryman or BTEQ. Created detailed technical specifications and release documentation for assigned ETL and reporting projects. Developed Cognos cube reports for analysis of sales and marketing trends. Designed and implemented stored procedures, views, and other application database code objects. Loaded and transformed large sets of structured data into a Hadoop cluster using Talend Big Data Studio. Maintained source definitions, transformation rules, and target definitions using Repository Manager. Created a death registry data model to load the EDW. Extracted the Keep The Change and opt-in/opt-out modules' accounting reporting processes utilizing COBOL/JCL/MVS/TIFO/DB2/SQL stored procedure technologies. Led both the technical and sales teams. 
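Several bullets above mention integrating incremental data loads into the warehouse. A minimal sketch of the usual high-water-mark pattern in Python; the `updated_at` column name and sample rows are hypothetical, not taken from the original project:

```python
from datetime import datetime

def incremental_extract(rows, high_water_mark):
    """Return only rows modified after the last successful load,
    plus the new high-water mark to persist for the next run."""
    fresh = [r for r in rows if r["updated_at"] > high_water_mark]
    new_mark = max((r["updated_at"] for r in fresh), default=high_water_mark)
    return fresh, new_mark

# Hypothetical source rows with last-modified timestamps.
source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
]
fresh, mark = incremental_extract(source, datetime(2024, 1, 2))
```

Only rows stamped after the stored mark are pulled, so repeated runs never reload unchanged data.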
Used the Stored Procedure transformation in Informatica to execute procedures before/after the target load. Interacted with business people on the external vendors' side, gathered the business requirements, and translated them into technical specifications. Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort. Created a user interface application for automation of the migration process (UAT/PROD). Interacted with users for requirements analysis and design of the user interface. Provided technical assistance for configuration, administration, and monitoring of Hadoop clusters. Designed and developed a business intelligence architecture document to implement an OLAP solution. Designed and developed DataStage parallel jobs involving extraction, transformation, and loading of data for separate interfaces. Developed documentation for the procedures. Designed and developed the International Student Office MicroStrategy BI application. Conducted initial data analysis, created technical design documents, and coded ETL processes and fact aggregation (PL/SQL). Created business-specific reports using Cognos Impromptu. Worked on syncing an Oracle RDBMS to Hadoop while retaining Oracle as the main data store. Analyzed old reports designed in Crystal Reports and transformed them into SSRS reports with proper mapping and error handling. Developed documentation and procedures for refreshing slowly changing dimension tables in the in-house data warehouse. Moved report datasets from OLTP to the new data mart. Wrote code using Base SAS and SAS/Macros to extract, clean, and validate data from Teradata tables. Worked extensively with Sqoop for importing and exporting data from Oracle. Developed simple to complex MapReduce jobs using Hive and Pig. Developed UNIX scripts to run multiple DataStage jobs in sequence and in parallel, and to control dependencies at every stage. 
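The Sqoop import/export work mentioned above usually comes down to assembling a command line for each table. A sketch that builds one as an argument list; the JDBC URL, table name, directory, and user are placeholder values, not details from the source:

```python
def sqoop_import_cmd(jdbc_url, table, target_dir, username):
    """Assemble a Sqoop import command as an argument list,
    ready to hand to a scheduler or subprocess call."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # placeholder JDBC URL
        "--username", username,
        "--table", table,
        "--target-dir", target_dir,   # HDFS landing directory
        "--num-mappers", "4",         # parallelism; tune per table size
    ]

cmd = sqoop_import_cmd("jdbc:oracle:thin:@//dbhost:1521/ORCL",
                       "CLAIMS", "/staging/claims", "etl_user")
```

Keeping the command as a list (rather than one string) avoids shell-quoting problems when the job is launched from a wrapper script.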
Data Warehouse Analysts provide support with various aspects of data warehouse development. Gathered requirements for compilation into functional and technical specifications. Tracked and identified slowly changing dimensions (SCD), heterogeneous sources, and dimension hierarchies for the ETL process. Defined reference lookups, aggregations, constraints, and derivations. Explored the Spark API over Cloudera Hadoop YARN to perform data analysis tasks on the data in Hive. Developed, tested, and deployed the full code package for the EDW in SVN and Visual SourceSafe. Stored extracted documents in XML with Base64-encoded content to reduce disk storage requirements. Designed lookup strategies using the Hashed File stage for data extraction from the source systems. Created a fully automated readmission predictive model for classifying and reporting hundreds of Medicare members with high readmission probability to reduce hospital admission costs. Coordinated with onshore and offshore teams regarding business needs and ensured adherence to business requirements. Processed ingested raw data using MapReduce, Apache Pig, and Hive. Developed MapReduce programs for data access and manipulation. Worked on data serialization formats for converting complex objects into sequences of bits using JSON and XML formats. Assisted the System Engineering and Network Infrastructure teams during installation, implementation, and maintenance of complex databases and applications. Created MapReduce programs to parse the data for claim report generation and ran the JARs in Hadoop. Developed process frameworks and supported data migration on Hadoop systems. 
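Tracking slowly changing dimensions, as described above, is most often handled with a Type 2 merge: close out the current dimension row when a tracked attribute changes and append a new current row. A minimal Python sketch; the single tracked attribute (`city`) and field names are illustrative assumptions:

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today):
    """Type 2 SCD sketch: expire the current row on change and
    append a new current row with today's effective date."""
    current = {r["nk"]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["nk"])
        if cur is None or cur["city"] != rec["city"]:
            if cur is not None:
                cur["is_current"] = False   # close the old version
                cur["end_date"] = today
            dim_rows.append({"nk": rec["nk"], "city": rec["city"],
                             "start_date": today, "end_date": None,
                             "is_current": True})
    return dim_rows

dim = [{"nk": 100, "city": "Austin", "start_date": date(2020, 1, 1),
        "end_date": None, "is_current": True}]
dim = scd2_apply(dim, [{"nk": 100, "city": "Dallas"}], date(2024, 6, 1))
```

History is preserved: the Austin row stays in the table with an end date, and queries on `is_current` see only the Dallas row.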
Collected log data from web servers and integrated it into HDFS using Flume. Analyzed, researched, documented, and recommended OLAP business intelligence architectures for Nike enterprise business reporting solutions. Business intelligence is a technology-driven process, so people who work in BI need a number of hard skills, such as computer programming and database familiarity. Coordinated with DBAs and technology development staff to manage source system changes. Used TOAD for analyzing data and building indexes. Designed and developed OLAP business models and reports using PowerPlay to analyze budgets and cash flow for the Asset Management group. Performed ad-hoc reporting and data clean-up via T-SQL. Reviewed and fine-tuned T-SQL stored procedures and triggers before production migration. Data warehousing is a critical component for analyzing and extracting actionable insights from your data. Worked with various transformation types: Lookup, Update, Merge, Joiner, Filter, Sorter, and Aggregation. Prepared unit test cases for the reports. Environment: DataStage, Windows 2000, Oracle 8i. Installed Hadoop along with MapReduce, Hive, and Pig. Identified job dependencies to design workflows for Oozie and YARN resource management. Drafted requirements, received stakeholder feedback, and developed Crystal Reports, Web Intelligence, and Desktop Intelligence reports, Xcelsius dashboards, and Universes. Worked with the DBA and GSD teams on performance tuning of SQL and MDX queries. Maintained ETL functional specifications, test plans, and data for data conversions. Used SQL Profiler, Windows Performance Monitor, Index Tuning Wizard, and DB Artisan for troubleshooting, monitoring, and performance tuning. Replaced the previous workflow process, achieving speed gains of up to 200%. Evaluated new technical specifications of Cognos to replace Aperio. 
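Web server logs collected with Flume, as in the first bullet above, are typically parsed downstream into structured records. A sketch of that parsing step for Common Log Format lines, using only the standard library; the sample line is invented:

```python
import re

# Apache Common Log Format: host ident user [timestamp] "request" status size
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_access_line(line):
    """Parse one Common Log Format line into a dict, or None if malformed."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

rec = parse_access_line(
    '10.0.0.7 - - [12/Mar/2024:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 5120'
)
```

Returning `None` for malformed lines lets the loader route them to a reject file instead of failing the whole batch.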
Assisted in analyzing incoming equipment and developing the necessary control applications in Linux and UNIX. Ported the code for 4 US insurance products from a VMS to a Linux platform for a web-based product. Developed sophisticated cube and OLAP reports using Crystal Info. Performed database administration for a SQL Server-based staging environment. Created jobs in SQL Server Agent that execute ETL packages daily and write log and audit information to related tables. Prepared ETL technical detailed design documents for staging and the data warehouse. Identified new technologies and technological improvements to include in the application to improve usability, stability, and maintainability. Developed a business intelligence application to analyze NHTSA safety standards using SSRS and SSAS 2012. Used the Sequential File stage as the source for most of the source systems. Created and scheduled ETL packages that update OLAP cubes in the data mart. Participated in regression testing and report fixing for the conversion to Cognos 8. Fine-tuned existing Informatica mappings for performance optimization. Created a sequence in SQL Server 2012 to insert the same identity value into multiple dimension tables as per the business rules. Collaborated with business customers to identify requirements and create technical specifications. Ensured data integrity, performance quality, and resolution of SSIS data load failures and SSRS reporting issues. Optimized and instituted best-practice standards to improve quality and reliability of data used for payments/credits to magazine publishers. Designed specifications for a Business Objects Universe in an Oracle 11g database environment. Analyzed Java code to implement the same logic in Business Objects Universes and reports. Supported implementation and execution of MapReduce programs in a cluster environment. 
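Agent jobs that write log and audit information to related tables, as described above, follow a simple wrapper pattern: record start time, run the step, record end time and status. A sketch using SQLite in place of SQL Server; the audit table schema is illustrative:

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE etl_audit (
    job_name TEXT, started_at TEXT, finished_at TEXT,
    status TEXT, rows_loaded INTEGER)""")

def run_with_audit(conn, job_name, job):
    """Run an ETL step and record start/end, status, and row count
    in an audit table - the pattern an Agent job would follow."""
    started = datetime.now().isoformat()
    try:
        rows = job()          # job returns number of rows loaded
        status = "SUCCESS"
    except Exception:
        rows, status = 0, "FAILED"
    conn.execute("INSERT INTO etl_audit VALUES (?, ?, ?, ?, ?)",
                 (job_name, started, datetime.now().isoformat(), status, rows))
    return status

run_with_audit(conn, "daily_load", lambda: 1250)
run_with_audit(conn, "bad_load", lambda: 1 / 0)   # simulated failure
```

Failures land in the same table as successes, so one query answers "what ran last night and did it work."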
Created a repository in GitHub (version control system) to store the project and keep track of changes to files. Created reports in SQL Server Reporting Services (SSRS) Report Builder 2.0. Optimized T-SQL queries with the use of SQL Profiler, indexes, and execution plans for faster performance. Designed the database using star schema methods to support the OLAP data structures and their dependencies efficiently. Created an SSISDB catalog with SQL Server 2012 to support SSIS package deployments in SSMS with environment variables for each environment. Developed data mappings between source systems and warehouse components. Responsible for delivery of multiple projects and managed revenue of over $18 MM. Designed an ODS for product cost controlling, profitability analysis, and overhead cost controlling. Extracted data from flat files and XML and applied business logic to load them into the central Microsoft SQL Server database. Created variables and configuration files to make packages more dynamic and scheduled jobs to run automatically using SQL Server Agent. Analyzed financial data using graphs and charts in SSRS for predicting future trends. Involved in production server activities for database development and the report server. Created, maintained, and stored manual test cases and test plans using Mercury TestDirector. Developed Cognos 8 reports for end users to analyze the data in the data warehouse. Scheduled an Oozie workflow to automatically update the firewall. Generated XML files, stored BLOB files in the database, and built a currency conversion process using Oracle PL/SQL packages and procedures. Designed a data quality architecture framework for source systems profiling in Mainframe, Oracle, DB2, and SQL Server. Designed and developed complex Aggregate, Expression, Filter, Join, Router, Lookup, and Update transformation rules. Worked extensively on Spark using both Python and Scala for data analysis. 
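Star-schema loading, as in the design bullets above, hinges on resolving natural keys against a dimension to get surrogate keys for the fact table. A Python sketch with a hypothetical customer dimension; mapping unmatched rows to a default "unknown" member (key -1) is a common convention, not something stated in the source:

```python
def load_fact(sales, customer_dim):
    """Resolve natural keys to surrogate keys while building fact rows;
    unmatched rows fall back to the 'unknown' dimension member (-1)."""
    lookup = {d["customer_nk"]: d["customer_sk"] for d in customer_dim}
    return [{"customer_sk": lookup.get(s["customer_nk"], -1),
             "amount": s["amount"]}
            for s in sales]

dim = [{"customer_nk": "C001", "customer_sk": 10},
       {"customer_nk": "C002", "customer_sk": 11}]
facts = load_fact([{"customer_nk": "C001", "amount": 250.0},
                   {"customer_nk": "C999", "amount": 80.0}], dim)
```

The unknown-member fallback keeps the fact load from failing on late-arriving or dirty keys; those rows can be reprocessed once the dimension catches up.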
Worked on loading data from several flat file sources using Teradata MLOAD and FLOAD for market risk. Performed physical modeling, ETL, staging design, MySQL implementation, OLAP, and end-user query and report generation. Scheduled and monitored automated weekly jobs in a Linux environment. Coordinated with technical teams for installation of Hadoop and related third-party applications. Integrated the NoSQL database HBase with Apache Spark to move bulk amounts of data into HBase. Developed PL/SQL packages, procedures, functions, and triggers, along with SQL queries and Perl scripts. Created, maintained, and supported the Teradata architectural environment. Enhanced the performance of ETL processes to ensure data import and processing occurred within a 3-hour window. Getting a job in the very competitive data industry necessitates having the right set of tools, such as sharp data warehouse skills. Validated and reconciled data between the source systems, data marts, and OLAP. Validated information across HBase and Hive tables containing secure patient information. Loaded data from MySQL, a relational database, to HDFS on a regular basis using Sqoop import/export. Developed data reconciliation SQL queries to validate the overall data migration. Developed MapReduce jobs to convert data files into Parquet file format. Analyzed the source systems to identify subject areas and fact and dimension entities. Developed a collection of PL/SQL packages, SQL scripts, and Pro*C scripts to generate the extracts defined in the requirement specification. Prepared the test matrix, test data, and test cases for the SQA team. Performed analysis of reference data, data sets, and asset classes in order to bring data into the central repository. Designed and developed MapReduce programs. Designed and implemented daily data migration between the central data warehouse server and the application/reporting and analyst database servers. 
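The data reconciliation queries mentioned above usually compare row counts and control totals between source and target after a migration. A self-contained sketch using SQLite with invented table names and data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);   -- one row missing
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals between source and target;
    non-zero differences flag a migration gap."""
    s_cnt, s_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    t_cnt, t_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}").fetchone()
    return {"count_diff": s_cnt - t_cnt, "sum_diff": s_sum - t_sum}

diff = reconcile(conn, "src_orders", "tgt_orders")
```

Count checks catch dropped rows; the control total additionally catches rows that arrived with mangled amounts.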
Created HBase tables to store various data formats of PII data coming from different portfolios. Used various transformations, including XML Parser transformations, to parse web log files and load them into Oracle. Coded many MapReduce programs to process unstructured log files. Participated in reporting requirements gathering meetings as set forth in legislatively mandated procedures/policies. Imported streaming data using Apache Storm and Apache Kafka into HBase and designed Hive tables on top. Developed stored procedures and modified triggers using PL/SQL to verify, cleanse, and scrub the data. Employed compound OLAP methods to join data structures together. Maintained and enhanced several SSAS business intelligence solutions for accounts receivable (AR), application submissions (APPSUB), and logging. Documented operational problems, following standards and procedures, using the issue-tracking tool JIRA. Used HBase for scalable storage and fast queries. Worked on complex coding in ESQL to capture the required data from different levels of an XML document. Prepared the migration document to move the mappings from development to QA and then to the production repositories. Used reverse engineering to connect to an existing database and create a graphical representation. Utilized Informatica IDQ to complete initial data profiling and matching/removal of duplicate data. Maintained reliability and availability of the existing data warehouse, cubes, Reporting Server, Integration Services, and Analysis Services instances. Coordinated with business customers to gather business requirements. Created covering indexes to avoid bookmark lookups and improve query performance. Developed and modified stored procedures for the sales force application. Utilized PL/SQL, Cognos, and TOAD for creating and maintaining ad-hoc reports. Developed a single interface to handle both existing XML and JSON traffic via a single MPG using DP rules and XSL. Used FEXPORT and EXPORT to unload data from Teradata to flat files. 
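Row-key design for HBase tables holding per-member data, as above, often salts the key so writes for monotonically increasing IDs spread across regions instead of hot-spotting one server. A sketch of one common salting scheme; the field layout and bucket count are assumptions, not details from the original tables:

```python
import hashlib

def salted_row_key(member_id, event_ts, buckets=8):
    """Prefix the key with a stable hash-derived salt so sequential
    member IDs distribute across HBase regions; scan per-member data
    by combining the salt with the member ID prefix."""
    digest = hashlib.md5(member_id.encode()).digest()
    salt = digest[0] % buckets          # stable: same member, same bucket
    return f"{salt:02d}|{member_id}|{event_ts}"

key = salted_row_key("M12345", "2024-06-01T00:00:00")
```

Because the salt is derived from the member ID (not random), all rows for one member still sort together and remain efficiently scannable.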
Documented the procedures for interfacing data from all source systems to the data warehousing system. Created requirements, solution, external design, internal design, project control, test plan, and turnover documentation. Launched new EDW user support and training websites. Partitioned SSAS cubes and assigned appropriate storage modes such as MOLAP and ROLAP as per business requirements. Established new workflow processes that resulted in 150% speed gains for the company. Managed 10+ data source systems that turned in $20M profit for multiple projects. Revamped custom software models to boost corporate revenue by 45%. Created SSAS cubes and hierarchies to provide dealers with views of sales products, subscriptions, new deals, and cancellations. Designed various mappings (source-to-target) using DataStage to link different source systems and destination systems. Designed and developed proof-of-concept solutions addressing business requirements. Developed Perl web pages for new application features. Used Cognos Framework Manager for pulling the data for online reporting and analysis. Developed and maintained different MapReduce, Hive, and Pig jobs through workflows in Oozie. Optimized the embedded application T-SQL code as well as the stored procedures used to feed reports. Developed generic translation modules to convert OLTP data for storage in the warehouse database. Automated data loading and balancing procedures and designed cross-reference procedures to link customers to their accounts. 
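The "generic translation modules to convert OLTP data" bullet above describes denormalization: resolving foreign keys from the normalized source into one wide warehouse row. A sketch with an invented order/customer/product schema:

```python
def translate_order(order, customers, products):
    """Flatten a normalized OLTP order row into one wide warehouse row
    by resolving customer and product foreign keys (illustrative schema)."""
    c = customers[order["customer_id"]]
    p = products[order["product_id"]]
    return {"order_id": order["id"],
            "customer_name": c["name"],
            "customer_region": c["region"],
            "product_name": p["name"],
            "quantity": order["qty"],
            "revenue": order["qty"] * p["price"]}   # derived measure

customers = {7: {"name": "Acme", "region": "West"}}
products = {3: {"name": "Widget", "price": 4.5}}
row = translate_order({"id": 1, "customer_id": 7, "product_id": 3, "qty": 10},
                      customers, products)
```

Doing the joins once at load time is the trade the warehouse makes: wider rows and some redundancy in exchange for join-free analytical queries.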
Performed visualizations according to business requirements with a custom visualization tool built in AngularJS. Developed mappings to read different sources, such as mainframe files, flat files, SQL Server, and Oracle DB. Developed complex PL/SQL data loading packages (working with SQL*Loader) for the postal reporting system (ICPAS). Created Pig Latin and Sqoop scripts. Established a custom software team, increasing corporate revenue by over 30%. Designed and developed the ETL mappings for source system data extraction, transformation, and aggregation. Developed Java MapReduce programs on ITCM log data to transform it into a structured form. Logged sensitive data using log4j with Flume agent property settings. Used shell scripts to make independent files and load them onto the Teradata EDW. Created the repository using Repository Manager. Exported filtered data into HBase for fast queries. Designed a dynamic data repository structure to accommodate fluctuating external data sources. Designed the dimensional database - star schema - and created the physical tables in Oracle. Maintained user groups, privileges/rights, and passwords in the BrioQuery repository. Exported the aggregated data to Oracle using Sqoop for reporting on the dashboard. Coordinated with subject matter experts and the QA team on launch timing and sequencing. Developed multiple proofs of concept to justify viability of the ETL solution, including performance and compliance with non-functional requirements. Worked on a Java transformation to create a date list between two dates and load it into a fact table. Used DataStage Director to execute jobs, monitor execution status, and view logs, and also to schedule jobs and batches. Monitored the execution of UNIX shell scripts to ensure the successful completion of all business intelligence processes. Designed, developed, and implemented PowerPlay cubes using Cognos Transformer. Scheduled workflows using Linux shell scripts. 
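Generating a date list between two dates, as the Java transformation bullet above describes, is the core of building a date dimension or calendar spine. A minimal standard-library sketch of the same idea in Python:

```python
from datetime import date, timedelta

def date_range(start, end):
    """Return every date from start to end inclusive - the kind of
    list a date-dimension or fact-table load would consume."""
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]

dates = date_range(date(2024, 3, 1), date(2024, 3, 5))
```

Materializing the full range up front guarantees the fact table has a row for every calendar day, including days with no activity.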
A data warehouse is a central repository of information that can be analyzed to make more informed decisions. Developed various bulk load and update procedures and processes using SQL*Loader and PL/SQL in an Oracle 9i environment. Created Oozie workflow and coordinator jobs to kick off jobs on time as data becomes available. Developed PL/SQL procedures and functions to support the reports by retrieving data from the data warehousing application. Used various SSIS tasks such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging. Developed a search engine to match dirty item descriptions against the database using PL/SQL and regular expressions. Developed MapReduce jobs to read and write Parquet files. Enhanced and expanded the encounter data warehouse model through sound, detailed analysis of business requirements. Started the VHA Metadata Repository Working Group. Designed a federated database for normalizing and standardizing data from heterogeneous systems in a top-down approach. Developed end-to-end ETL process documents. Developed and created logical and physical database architectures using the ERwin data modeler. Provided full support for the Smart Link application and was responsible for updating the feedback mailbox as well as the issue log. Created packages, procedures, functions, and triggers, embedding dynamic SQL features in advanced PL/SQL packages. Designed and implemented a star-schema data warehouse in SQL Server that is used as a source for reports. 
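The Conditional Split usage described above routes staging rows by validation rules before they reach the warehouse tables. A Python sketch of the same idea; the two rules here (non-empty id, positive numeric amount) are illustrative, not taken from the source:

```python
def conditional_split(rows):
    """Route staging rows to clean or reject piles based on simple
    validation rules, akin to an SSIS Conditional Split."""
    clean, rejects = [], []
    for r in rows:
        if (r.get("id")
                and isinstance(r.get("amount"), (int, float))
                and r["amount"] > 0):
            clean.append(r)
        else:
            rejects.append(r)      # kept for audit, not discarded
    return clean, rejects

clean, rejects = conditional_split([
    {"id": "A1", "amount": 12.5},
    {"id": "", "amount": 9.0},     # missing key -> reject
    {"id": "A3", "amount": -4},    # invalid amount -> reject
])
```

Routing rejects to their own pile (instead of dropping them) mirrors the usual SSIS pattern of landing failed rows in an error table for review.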
In computing, a data warehouse (DW or DWH), also known as an enterprise data warehouse (EDW), is a system used for reporting and data analysis, and is considered a core component of business intelligence. Analyzed data using the Hadoop components Hive and Pig and created tables in Hive for the end users. Created session and repository variables to achieve the desired functionality. Used Impala for performance tuning to handle high-concurrency queries run by various teams on HDFS. Used DB2 stages to read data and transformed it into target SQL tables using various transformation (business) rules. Used SSAS to create cubes with various measures and dimensions for financial report generation. Used various built-in transformations in SSIS, including SCD, Aggregate, and Lookup. Developed UNIX shell scripts to load flat files from the source system into the staging-area tables. Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data using Hadoop. Created business requirement and high-level design documents.
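Loading flat files into staging tables, as the shell-script bullet above describes, reduces to parse-and-insert. A sketch using SQLite and the csv module with a pipe-delimited file; the table layout and sample data are invented:

```python
import csv
import io
import sqlite3

def load_flat_file(conn, text):
    """Load a pipe-delimited flat file into a staging table - the step
    a UNIX shell script would normally drive with a bulk loader."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_customers (id TEXT, name TEXT)")
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    rows = [(r["id"], r["name"]) for r in reader]
    conn.executemany("INSERT INTO stg_customers VALUES (?, ?)", rows)
    return len(rows)

conn = sqlite3.connect(":memory:")
n = load_flat_file(conn, "id|name\n1|Acme\n2|Globex\n")
```

Staging columns are kept as text on purpose: type casting and validation happen in the next ETL step, so a bad value never aborts the raw load.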