Send resume to: firstname.lastname@example.org
Any visa is fine as long as the candidate meets the criteria below.
We need someone with 14-15 years of experience, with the last 5 years working as a Big Data Architect with Hive, HDFS, HBase, YARN, Spark, and Sentry or Ranger experience. The client is migrating their tenants' applications from the Cloudera distribution to Hortonworks, so experience with the nuances of that migration is good to have.
Big Data Architect JD: 1 Position
- 12+ years of hands-on experience in a variety of platform and data development and architecture roles
- 5+ years of experience in big data technology, ranging across platform architecture, data management, data architecture, and application architecture
- Expert in designing and implementing solutions for multi-tenancy, with the ability to drive automation
- Good understanding of the current industry landscape and trends in tools for ingestion, processing, consumption, and security management
- 2+ years of experience working with data warehouse solutions available in AWS
- Proficiency with the Hadoop platform, including Kafka, Spark, HBase, Impala, Hive, and HDFS, in multi-tenant environments
- Solid base in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience with metadata management and data quality processes and tools
- Experience in full lifecycle architecture guidance
- Advanced analytical thinking and problem-solving skills
Send resume to: email@example.com
Need EADs (H4/L2), GC, USC
BI ETL Developer
NY, NY (final F2F required, but will consider non-local candidates as long as they can attend the F2F)
Phone/F2F (will Skype non-local candidates before the F2F; client will assist with the cost of the F2F)
- Bachelor’s degree or higher, or equivalent relevant experience.
- Minimum of 5 years of experience in ETL development in business intelligence/data integration projects.
- 5+ years of experience working with Informatica, GoldenGate, etc.
- 5+ years of experience working with Relational Databases like Teradata, Vertica, Oracle, SQL Server, etc.
- 3+ years of experience working with Unix Shell Scripts.
- Experience writing complex SQL to analyze and troubleshoot issues.
- Strong data analysis and troubleshooting skills.
- Experience in integrating technologies with internal and external data sources.
- Goal oriented and creative personality with good interpersonal skills.
- Ability to manage multiple assignments simultaneously and follow up on unfinished business.
- Ability to work independently and in a team environment.
- Ability to pay attention to the details and be organized.
- Commitment to internal client and customer service principles.
- Willingness to take initiative and to follow through on projects.
- Excellent time management skills, with the ability to prioritize, multi-task, and work under shifting deadlines in a fast-paced environment.
- Excellent communication and motivation skills, and ability to interact appropriately with senior-level colleagues and business users.
- Must have legal right to work in the U.S.
- Experience sourcing data using API calls (REST/SOAP) is a plus.
- Experience working with sourcing and uploading data to and from cloud systems like AWS/GCP is a plus.
- Strong Teradata experience desired; Oracle, MS SQL Server, Vertica, Netezza, etc. are acceptable alternatives. Comprehensive understanding of data warehouse design principles.
- Ability to proactively identify, troubleshoot and resolve complex data integrity issues.
- Subscription-based industry experience preferred.
- Knowledge and understanding of REST/SOAP protocols desirable.
- Understanding of data integration between on-premise and cloud-based data systems.
- Understanding of Big Data technologies and tools such as Hadoop, HDFS, Hive, Spark, Flink, Kafka, and Sqoop.
- Thorough knowledge of MS-Office Suite (Word, Excel, PowerPoint, Access).
The ETL Developer will work closely with the Product Owner, Scrum Master, ETL Developers, and various business stakeholders to support and ensure quality of delivery for the client’s data extraction, transformation, and load needs, through the development of new routines and/or enhancement of existing ones, or the management of contractor-sourced resources. The role will also implement quality control and audit practices to ensure continuous improvement of the client’s assets.
Duties and Responsibilities:
- Work with various business stakeholders to identify their information needs.
- Work with data architecture team to understand the source and target data models.
- Work closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs.
- Develop and be a proponent for Data Warehouse development standards, including ETL standards and best practices.
- Provide documentation for all developed ETL processes, including process flow and source and target mapping.
- Conduct ETL performance tuning and troubleshooting.
- Work with the business intelligence operations team to ensure control objectives are implemented for all data movement processes, implement change control for production data, and establish and follow proper incident management procedures.
- Ensure compliance with all applicable data privacy regulations and policies as they relate to both firm and client/contact data.