Big Data Architect for Delaware, USA

Send resume to: am@truceaid.com

Any visa is fine as long as the candidate meets the criteria below.

We need someone with 14-15 years of experience, with the last 5 years working as a Big Data Architect with Hive, HDFS, HBase, YARN, Spark, and Sentry or Ranger experience. The client is migrating their tenants’ applications from the Cloudera distribution to Hortonworks, so someone with experience in the nuances of that migration will be good to have.

Location: Delaware

Big Data Architects JD: 1 Position

  • 12+ years of hands-on experience in a variety of platform and data development and architecture roles
  • 5+ years of experience in big data technology, ranging across platform architecture, data management, data architecture, and application architecture
  • Expert in designing and implementing solutions for multi-tenancy, with the ability to drive automation
  • Good understanding of the current industry landscape and trends in tools for ingestion, processing, consumption, and security management
  • 2+ years of experience working with available data warehouse solutions in AWS
  • Proficiency working with the Hadoop platform, including Kafka, Spark, HBase, Impala, Hive, and HDFS, in multi-tenant environments
  • Solid grounding in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience in metadata management and data quality processes and tools
  • Experience in full-lifecycle architecture guidance
  • Advanced analytical thinking and problem-solving skills

Data Architect / Modeler for Delaware, USA

Any visa is fine.

Send resume to: am@truceaid.com

Data Architect / Modeler: 6 positions

We need to fill six positions (2 senior, with 10 to 15 years’ experience, and 4 junior, with 6 to 10 years’ experience), all based out of the Delaware office. To keep this simple, please focus on the following bullets for candidate selection; I have provided more details in parentheses for each bullet item. Anyone with less than 5 years of experience won’t be considered as a Data Architect / Modeler.

  • 5-10 years of experience using data modeling tools: ERwin (this is a must, as they use this tool)
  • Experience with large-scale data warehousing environments (this is also a must, so find folks with multiple EDW deployment experience where they have played the role of Data Architect / Data Modeler)
  • Experience with/understanding of business process reengineering and business modeling concepts, and business systems development and analysis (the candidate should have done this with business folks / power users)
  • Experience with Teradata; experience with Hadoop, Cassandra, and Hive is a plus (find folks who have designed database models for Teradata, big data (Hive), and NoSQL databases such as Cassandra, MongoDB, CouchDB, HBase, etc.)
  • Experience with change management procedures (more from a data model and model remediation perspective)
  • Experience with data governance (experience with tools such as Collibra, Apache Atlas, IBM DGC, etc.)

Big Data Architects for Delaware, USA

Any visa is fine.

Send resume to: am@truceaid.com

Long term contract

Location: Delaware

Big Data Architects JD: 1 Position

  • 12+ years of hands-on experience in a variety of platform and data development and architecture roles
  • 5+ years of experience in big data technology, ranging across platform architecture, data management, data architecture, and application architecture
  • Expert in designing and implementing solutions for multi-tenancy, with the ability to drive automation
  • Good understanding of the current industry landscape and trends in tools for ingestion, processing, consumption, and security management
  • 2+ years of experience working with available data warehouse solutions in AWS
  • Proficiency working with the Hadoop platform, including Kafka, Spark, HBase, Impala, Hive, and HDFS, in multi-tenant environments
  • Solid grounding in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience in metadata management and data quality processes and tools
  • Experience in full-lifecycle architecture guidance
  • Advanced analytical thinking and problem-solving skills

Hadoop Developer, Plymouth, MN, USA


Send resume to: am@truceaid.com

Need EADs (H4/L2), GC, USC only

We currently have an opportunity for a Hadoop Developer. This is a 12-month contract opportunity, with possible extension, in Plymouth, MN. I have included a brief description below of what the client is seeking:

Skype hire

 Requirements

  • Extensive experience as a Hadoop Developer
  • Experience with Sqoop, Kafka, Spark, Pig, and Hive
  • Experience with shell scripts
  • Strong in SQL (Hive); see the sketch below
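For illustration only, here is a minimal PySpark sketch of the kind of Hive-backed SQL work this role involves. The table and column names (web_logs, event_date) are hypothetical, and the sketch assumes a Spark installation configured against the cluster’s Hive metastore.

```python
# Minimal sketch: querying a Hive table from Spark.
# The table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-sql-example")
    .enableHiveSupport()  # read tables registered in the Hive metastore
    .getOrCreate()
)

# A typical analytical query against a Hive table.
daily_counts = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM web_logs
    GROUP BY event_date
    ORDER BY event_date
""")
daily_counts.show()
```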

BI ETL Developer for NY, USA


Send resume to: am@truceaid.com

Need EADs (H4/L2), GC, USC

BI ETL Developer 

NY, NY (final F2F required, but non-local candidates will be considered as long as they will attend the F2F)

5+ months

Phone/F2F (will Skype non-local candidates before the F2F; the client will assist with the cost of the F2F)

Required

  • Bachelor’s degree or higher, or equivalent relevant experience.
  • Minimum of 5 years of experience in ETL development in business intelligence/data integration projects.
  • 5+ years of experience working with Informatica, GoldenGate, etc.
  • 5+ years of experience working with Relational Databases like Teradata, Vertica, Oracle, SQL Server, etc.
  • 3+ years of experience working with Unix Shell Scripts.
  • Experience writing complex SQL to analyze and troubleshoot issues.
  • Strong data analysis and troubleshooting skills.
  • Experience in integrating technologies with internal and external data sources.
  • Goal oriented and creative personality with good interpersonal skills.
  • Ability to manage multiple assignments simultaneously and follow up on unfinished business.
  • Ability to work independently and in a team environment.
  • Ability to pay attention to the details and be organized.
  • Commitment to internal client and customer service principles.
  • Willingness to take initiative and to follow through on projects.
  • Excellent communication and motivation skills, and ability to interact appropriately with senior-level colleagues and business users.
  • Excellent time management skills, with the ability to prioritize and multi-task, and work under shifting deadlines in a fast-paced environment.
  • Must have legal right to work in the U.S.

Technical Skills:

  • Experience sourcing data using API calls (REST/SOAP) is a plus; see the sketch after this list.
  • Experience working with sourcing and uploading data to and from cloud systems like AWS/GCP is a plus.
  • Strong Teradata skills desired; Oracle, MS SQL Server, Vertica, Netezza, etc. can be alternates. Comprehensive understanding of data warehouse design principles.
  • Ability to proactively identify, troubleshoot and resolve complex data integrity issues.
  • Subscription-based industry experience preferred.
  • Knowledge and understanding of REST/SOAP protocols desirable.
  • Understanding of data integration between on-premise and cloud-based data systems.
  • Understanding of big data technologies and tools such as Hadoop, HDFS, Hive, Spark, Flink, Kafka, and Sqoop.
  • Thorough knowledge of MS-Office Suite (Word, Excel, PowerPoint, Access).
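As a hedged illustration of the API-sourcing bullet above, here is a minimal Python sketch that pulls records from a REST endpoint and stages them as a flat file for a downstream ETL tool. The endpoint URL, query parameter, and field names are invented for the example.

```python
# Hypothetical REST sourcing step ahead of an ETL load.
# The endpoint and field names are made up for illustration.
import csv
import requests

resp = requests.get(
    "https://api.example.com/v1/subscriptions",
    params={"updated_since": "2019-01-01"},
    timeout=30,
)
resp.raise_for_status()

# Stage the payload as a CSV for the downstream ETL tool to pick up.
fields = ["id", "status", "mrr"]
with open("subscriptions_stage.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    for record in resp.json():
        writer.writerow({k: record.get(k) for k in fields})
```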

Position Summary:

The ETL Developer will work closely with the Product Owner, Scrum Master, other ETL Developers, and various business stakeholders to support and deliver quality for the client’s data extraction, transformation, and load needs, through the development of new routines and/or enhancement of existing ones, or through the management of contractor-sourced resources. The developer will also implement quality control and audit practices to ensure continuous improvement of the client’s assets.

Duties and Responsibilities:

  • Work with various business stakeholders to identify their information needs.
  • Work with data architecture team to understand the source and target data models.
  • Work closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs.
  • Develop and be a proponent for Data Warehouse development standards, including ETL standards and best practices.
  • Provide documentation for all developed ETL processes, including process flow and source and target mapping.
  • Conduct ETL performance tuning and troubleshooting.
  • Work with the business intelligence operations team to ensure control objectives are implemented for all data movement processes, implement change control for production data, and establish and follow proper incident management procedures.
  • Ensure compliance with all applicable data privacy regulations and policies as they relate to both firm and client/contact data.

Supervisory Responsibilities:

  • None.

Need Hadoop Developer in Plymouth, MN


Please share all resumes: am@truceaid.com

Need EADs (H4/L2/OPTs), H1, GC, USC

We currently have an opportunity for a Hadoop Developer. This is a 12-month contract opportunity, with possible extension, in Plymouth, MN. I have included a brief description below of what the client is seeking:

Skype hire

Requirements

  • Extensive experience as a Hadoop Developer
  • Experience with Sqoop, Kafka, Spark, Pig, and Hive
  • Experience with shell scripts
  • Strong in SQL (Hive)

Analytics/Data Science Engineer, MA and TX, USA


Please send all resumes to: am@truceaid.com

Need EADs (H4/L2/OPTs), GC, USC

For a large web hosting company.

3-5 positions available. Candidates must be based in Boston or willing to relocate or commute to Boston, MA, or Houston, TX.
Analytics/Data Science Engineer:

  • 3+ years of expertise in Big Data technologies – ideally we’re looking for folks with Hadoop and Hive experience, but we can also consider folks who don’t have Hadoop experience but have experience with other big data technologies like Cassandra or MongoDB
  • 3+ years of expertise in writing SQL queries and ETL/ELT jobs using Spark (Java/Python/R/Scala) to pull/push data into the big data repositories; see the sketch after this list
  • 2+ years of experience in machine learning
  • Any experience in implementing Big Data / Data Lake and microservices integration solutions on clouds such as AWS, Azure, GCP, etc. is a plus
  • Any experience with BI tools like Tableau is a plus
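To ground the Spark ETL/ELT bullet above, here is a minimal sketch of such a job in PySpark. The input path, output path, and column names (orders.csv, customer_id, amount) are hypothetical.

```python
# Minimal ETL sketch in PySpark: extract a raw file, transform it,
# and load the result back into the big data repository.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: raw CSV landed in HDFS (could equally be S3, Kafka, etc.).
raw = spark.read.option("header", True).csv("hdfs:///landing/orders.csv")

# Transform: cast types and aggregate spend per customer.
customer_spend = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("customer_id")
       .agg(F.sum("amount").alias("total_spend"))
)

# Load: write a columnar table for downstream consumers.
customer_spend.write.mode("overwrite").parquet("hdfs:///warehouse/customer_spend")
```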