Duties shown on sample resumes of BI Developers include designing reports based on business requirements using SSRS, designing ETL loads to populate the reporting database using SSIS, and creating the stored procedures and functions required to extract data for the load. Typical responsibilities also cover configuring and working with Oracle BI Scheduler, Delivers, and BI Publisher, and configuring iBots.

Experience with various data ingestion patterns into Hadoop. Developed jobs in both Talend Platform for MDM with Big Data and Talend Data Fabric. Involved in creating new stored procedures and optimizing existing queries and stored procedures. Designed conceptual and logical data models and all associated documentation and definitions. Designed the mapping document that serves as a guideline for ETL coding. Expertise in creating and configuring the Oracle BI repository. Developed new reports per Cisco business requirements, involving changes to the ETL design and new database objects along with the reports. Used Avro, Parquet, and ORC data formats to store data in HDFS.

Worked on a Snowflake shared technology environment providing stable infrastructure, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, best practices, and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities; also worked on snowflake schemas and data warehousing. Created different types of dimensional hierarchies. Good knowledge of ETL concepts and hands-on ETL experience. Converted around 100 view queries from Oracle Server to Snowflake-compatible SQL and created several secure views for downstream applications. Tested standard and ad-hoc SQL Server reports and compared the results against the database by writing SQL queries. Designed dataflows for new feeds from upstream systems. Provided report navigation and dashboard navigation. Good working knowledge of an ETL tool such as Informatica or SSIS. Created ETL mappings and various transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Stored Procedure, and Update Strategy. Worked on Snowflake modeling; highly proficient in data warehousing techniques for data cleansing, slowly changing dimensions, surrogate key assignment, and change data capture. Worked with Kimball data modeling concepts, including data marts, dimensional modeling, star and snowflake schemas, fact aggregation, and dimension tables. Developed Talend MDM jobs to populate claims data into the data warehouse using star, snowflake, and hybrid schemas.

Sr. Snowflake Developer Resume - SUMMARY: Over 13 years of experience in the IT industry, with experience in Snowflake, ODI, Informatica, OBIEE, OBIA, and Power BI.
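The Oracle-to-Snowflake view conversion described above typically ends with secure views that downstream applications can query without seeing the view definition. The following is a minimal sketch in Snowflake SQL, not taken from any of the sample resumes; the schema, table, and role names are hypothetical:

    -- Hypothetical secure view exposing only the columns downstream consumers need.
    CREATE OR REPLACE SECURE VIEW reporting.customer_claims_v AS
    SELECT
        claim_id,
        customer_id,
        claim_amount,
        claim_status,
        updated_at
    FROM raw.claims
    WHERE claim_status <> 'DRAFT';

    -- Grant read access to the downstream application role.
    GRANT SELECT ON VIEW reporting.customer_claims_v TO ROLE downstream_app_role;

The SECURE keyword hides the view text from non-owning roles and disables optimizations that could expose rows filtered out by the view, which is why it is the usual choice for views handed to other teams.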
Data validations were performed through INFORMATION_SCHEMA. Implemented security management for users, groups, and web groups. Good understanding of Teradata SQL, the EXPLAIN command, statistics, locks, and the creation of views. Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape. Led a team to migrate a complex data warehouse to Snowflake, reducing query times by 50%. Strong knowledge of the SDLC. Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs. Experience in analyzing data using HiveQL. Participated in design meetings for the creation of the data model and provided guidance on data architecture best practices. Excellent knowledge of data warehousing concepts. Developed highly optimized stored procedures, functions, and database views to implement the business logic; also created clustered and non-clustered indexes. Responsible for implementing coding standards defined for Snowflake development. Used ETL to extract files for external vendors and coordinated that effort. Developed BI Publisher reports and rendered them via BI dashboards. Extensive experience in developing complex stored procedures and BTEQ queries.

ETL Tools: Informatica PowerCenter 10.4/10.9/8.6/7.13, MuleSoft, Informatica PowerExchange, Informatica Data Quality (IDQ).

Performed report validations and job re-runs. Participated in client business-need discussions and translated those needs into technical execution from a data standpoint. Experience in using Snowflake zero-copy cloning, SWAP, Time Travel, and different table types. Performed post-production validations: code validation and data validation after completion of the first cycle run. For long-running scripts and queries, identified join strategies, issues, and bottlenecks, and implemented appropriate performance-tuning methods. Developed Talend ETL jobs to push data into Talend MDM and to extract data from MDM. Handled analysis, design, coding, unit/system testing, UAT support, implementation, and release management.

If you're in the middle of your career or are generally looking to make your resume feel more modern and personal, go for the combination or hybrid resume format.

Responsible for designing and building data marts as per the requirements. Created different types of reports, including union and merged reports and prompts in Answers, and created different dashboards. Heavily involved in testing Snowflake to understand the best possible ways to use the cloud resources. Developed a data validation framework, resulting in a 25% improvement in data quality. Good knowledge of Snowflake multi-cluster architecture and components. Built solutions once and for all with no band-aid approach. Performed replication testing and configuration for new tables in Sybase ASE.
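Several of the duties above (zero-copy cloning, Time Travel, SWAP, and INFORMATION_SCHEMA validation) can be illustrated with a short Snowflake SQL sketch; the table names are hypothetical and the time-travel offset is arbitrary:

    -- Zero-copy clone of the table as it existed one hour ago (Time Travel).
    CREATE TABLE analytics.orders_asof_1h CLONE analytics.orders AT (OFFSET => -3600);

    -- Lightweight data validation through INFORMATION_SCHEMA: compare row counts.
    SELECT table_name, row_count
    FROM analytics.information_schema.tables
    WHERE table_schema = 'ANALYTICS'
      AND table_name IN ('ORDERS', 'ORDERS_ASOF_1H');

    -- Once validation passes, promote a rebuilt table atomically.
    ALTER TABLE analytics.orders_rebuilt SWAP WITH analytics.orders;

SWAP WITH exchanges the two table names in a single transaction, which keeps the post-validation cutover atomic.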
Snowflake Developer job description, technical and professional requirements: a minimum of 3 years of experience in developing software applications, including analysis, design, coding, testing, deployment, and support. Experience with the Snowflake cloud data warehouse and AWS S3 buckets for continuous data loading using Snowpipe.

Designed new reports in Jasper using tables, charts and graphs, crosstabs, grouping, and sorting. Responsible for unit, system, and integration testing, and performed data validation for all generated reports. Created new tables and an audit process to load the new input files from CRD. Involved in performance monitoring, tuning, and capacity planning. Worked on the performance tuning/improvement process and the QC process, supporting downstream applications with their production data load issues. Involved in the complete life cycle of creating SSIS packages: building, deploying, and executing the packages in both the development and production environments. Developed and maintained data pipelines for ETL processes, resulting in a 15% increase in efficiency. Expertise in developing SQL and PL/SQL code through various procedures, functions, packages, cursors, and triggers to implement the business logic in the database. Expertise in creating projects, models, packages, interfaces, scenarios, filters, and metadata; extensively worked on ODI knowledge modules (LKM, IKM, CKM, RKM, JKM, and SKM). Tuned slow-performing queries by examining execution plans. Good knowledge of Unix shell scripting. Knowledge of creating various mappings, sessions, and workflows. Proven ability to communicate highly technical content to non-technical people. Optimized SQL/PL/SQL jobs and reduced their execution time. Built and maintained data warehousing solutions using Snowflake, allowing for faster data access and improved reporting capabilities. Designed and implemented a data compression strategy that reduced storage costs by 20%. Ensured the accuracy of data and reports, reducing errors by 30%. Designed the database reporting for the next phase of the project.

Snowflake Architect & Developer Resume - SUMMARY: Overall 12+ years of experience in ETL architecture, ETL development, data modeling, and database architecture with Talend Big Data, Lyftron, Informatica, Apache Spark, AWS, NoSQL, Mongo, Postgres, AWS Redshift, and Snowflake. Delivered and implemented the project per scheduled deadlines, extending post-implementation and maintenance support to the technical support team and the client. Experience in the Splunk reporting system. Created ODI interfaces, functions, procedures, packages, variables, and scenarios to migrate the data. Involved in production moves. Coordinated and assisted the activities of the team to resolve issues in all areas and provide on-time deliverables. Created complex views for Power BI reports.

DBMS: Oracle, SQL Server, MySQL, DB2.

Monitored the project processes, making periodic changes and guaranteeing on-time delivery. Created various reusable and non-reusable tasks, such as Session tasks.

Programming Languages: PL/SQL, Python (pandas), SnowSQL.

Played a key role in migrating Teradata objects into the Snowflake environment. Performance tuning of Big Data workloads.
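Continuous loading from S3 with Snowpipe, as mentioned in the requirements above, is normally set up with an external stage and a pipe. A rough Snowflake SQL sketch, assuming a storage integration named s3_int is already configured; the bucket path, schema, and object names are illustrative only:

    -- File format and external stage over the S3 landing prefix.
    CREATE OR REPLACE FILE FORMAT etl.csv_fmt TYPE = CSV SKIP_HEADER = 1;

    CREATE OR REPLACE STAGE etl.s3_landing
        URL = 's3://example-bucket/landing/'
        STORAGE_INTEGRATION = s3_int        -- assumed to be configured separately
        FILE_FORMAT = etl.csv_fmt;

    -- Pipe that auto-ingests new files as S3 event notifications arrive.
    CREATE OR REPLACE PIPE etl.claims_pipe AUTO_INGEST = TRUE AS
        COPY INTO etl.claims_raw
        FROM @etl.s3_landing
        FILE_FORMAT = (FORMAT_NAME = 'etl.csv_fmt');

With AUTO_INGEST enabled, the bucket's event notifications trigger the pipe, so no external scheduler is needed for the continuous load.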
Snowflake's cloud-built architecture facilitates easier, faster, and more flexible data processing, data storage, and analytics compared to traditional products. Involved in various transformation and data cleansing activities using various control flow and data flow tasks in SSIS packages during data migration. Strong experience in migrating other databases to Snowflake. Developed workflows in SSIS to automate the tasks of loading data into HDFS and processing it using Hive. Performed file-level and detail-level validation and tested the data flow from source to target. Well versed in Snowflake features such as clustering, Time Travel, cloning, the logical data warehouse, and caching. Created complex mappings in Talend 7.1 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, tMDMInput, and tMDMOutput.
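As a brief illustration of the clustering feature listed above, the following hypothetical Snowflake SQL defines a clustering key on a large fact table and inspects how well the table is clustered; the table and column names are assumptions, not from the sample resumes:

    -- Define a clustering key so micro-partitions are organized by the columns
    -- most queries filter on.
    ALTER TABLE analytics.fact_sales CLUSTER BY (sale_date, region_id);

    -- Check how well the table is clustered on that key.
    SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fact_sales', '(sale_date, region_id)');

Automatic clustering then maintains the ordering in the background, and the average_depth value in the returned JSON is a common signal for whether the chosen key is effective.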