BigData Architect for Delaware, USA

Send resume to: am@truceaid.com

Any visa is fine as long as it meets the below criteria….

We need someone with 14-15 years of experience, with the last 5 years working as a Big Data Architect with Hive, HDFS, HBase, YARN, Spark, and Sentry or Ranger experience. The client is migrating its tenants' applications from the Cloudera distribution to Hortonworks, so experience with those migration nuances is good to have.

Location: Delaware

Big Data Architects JD: 1 Position

  • 12+ years of hands-on experience in a variety of platform and data development and architect roles
  • 5+ years of experience in big data technology, ranging across platform architecture, data management, data architecture and application architecture
  • Expert in designing and implementing solutions for multi-tenancy, with the ability to drive automation
  • Good understanding of the current industry landscape and trends in tools for ingestion, processing, consumption and security management
  • 2+ years of experience working with data warehouse solutions available in AWS
  • Proficiency working with the Hadoop platform, including Kafka, Spark, HBase, Impala, Hive and HDFS, in multi-tenant environments
  • Solid base in data technologies like warehousing, ETL, MDM, DQ, BI and analytical tools; extensive experience in metadata management and data quality processes and tools
  • Experience in full-lifecycle architecture guidance
  • Advanced analytical thinking and problem-solving skills

Big Data Architects for Delaware, USA

Any visa is fine….

Send resume to: am@truceaid.com

Long term contract

Location: Delaware

Big Data Architects JD: 1 Position (requirements are the same as the Delaware Big Data Architect posting above).

Hadoop Developer, Plymouth, MN, USA


Send resume to: am@truceaid.com

Need EADs (H4/L2), GC, or USC only

We currently have an opportunity for a Hadoop Developer. This is a 12-month contract opportunity, with possible extension, in Plymouth, MN. I have included a brief description below of what the client is seeking:

Skype hire

 Requirements

· Extensive experience as a Hadoop Developer.

· Experience with Sqoop, Kafka, Spark, Pig, and Hive.

· Experience with shell scripts.

· Strong in SQL (Hive).

Big Data Developer, Chicago, IL, USA


Send resumes to: am@truceaid.com

Big Data Developer

Chicago, IL

7+ month contract

No OPT/CPT/H1; need only EADs (H4/L2), GC and USC

Qualifications:

  • Graduate degree in Computer Science, Information Systems or an equivalent quantitative field, and 5+ years of experience in a reporting and analytics role
  • Experience working with and extracting value from large, disconnected and/or unstructured datasets
  • Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management
  • Strong interpersonal skills and the ability to project-manage and work with cross-functional teams
  • Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases, Hadoop, Pig, Datameer or other big data tools
  • Experience building and optimizing ‘big data’ pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experience with the following tools and technologies: Hadoop, Spark, Kafka, Pig, Datameer, relational SQL and NoSQL databases

 

DevOps Engineer, St. Louis, MO, USA


Send resumes to: am@truceaid.com

DevOps Engineer

Denver (Non-Local accepted)

6+ months

Required Qualifications and Skills

  • Mastery of Python, plus Java or another automation/scripting language, is required
  • Must have experience with one or more databases (Postgres, MongoDB, REDIS)
  • Must be able to architect, design and implement software solutions to meet the business requirements provided by the Product Owner.
  • Must be able to research and evaluate commercial and open-source software components, and incorporate those in the overall solutions
  • Must have strong algorithm and data structure implementation experience
  • Must have experience with Cloud and web services architectures and technologies, and be able to create scalable, automated and fault-tolerant solutions.
  • Ability to thrive in an agile, fast-paced environment while delivering high quality mission-critical software
  • Passionate about the technology and the craft of software engineering
  • Experience with containerization using Docker, and container orchestration systems like Kubernetes
  • Experience with or knowledge of GraphQL is highly desirable.
  • Experience with Node.js
  • Experience working on highly distributed systems and distributed messaging systems like RabbitMQ, Kafka, AMQP.
  • Experience working with Elastic Search.
  • Good understanding of SCM/CI-CD/DevOps best practices and the tools used to implement them (Git, Jenkins, Jira, Ansible/Chef/Puppet).
  • Good grasp of SDN concepts; active participation in the design and development of proprietary and/or open-source SDN controllers is highly desirable.
  • Experience with large-scale data communication network architecture
  • Experience with or knowledge of StackStorm and/or OpenDaylight (ODL).
  • Experience with or knowledge of common networking protocols (e.g. BGP, IS-IS, OSPF, PCEP).
  • Fluency in other languages like C++, C, and JavaScript.

The major responsibilities include:

  • Architecting, designing and developing software to meet the product's functional, performance and business requirements
  • Operationalizing the solution and supporting its lifecycle in production using a DevOps approach
  • Researching new technologies and adopting suitable ones to solve the problem at hand

BI ETL Developer for NY, USA


Send resume to: am@truceaid.com

Need EADs(H4/L2), GC, USC

BI ETL Developer 

NY,NY (Final F2F required but will consider non-local as long as they will attend F2F)

5+ months

Phone/F2F (will Skype non-local candidates before the F2F; client will assist with the cost of the F2F)

Required

  • Bachelor’s degree or higher, or equivalent relevant experience.
  • Minimum of 5 years of experience in ETL development in business intelligence/data integration projects.
  • 5+ years of experience working with Informatica, GoldenGate, etc.
  • 5+ years of experience working with Relational Databases like Teradata, Vertica, Oracle, SQL Server, etc.
  • 3+ years of experience working with Unix Shell Scripts.
  • Experience writing complex SQL to analyze and troubleshoot issues.
  • Strong data analysis and troubleshooting skills.
  • Experience in integrating technologies with internal and external data sources.
  • Goal oriented and creative personality with good interpersonal skills.
  • Ability to manage multiple assignments simultaneously and follow up on unfinished business.
  • Ability to work independently and in a team environment.
  • Ability to pay attention to the details and be organized.
  • Commitment to internal client and customer service principles.
  • Willingness to take initiative and to follow through on projects.
  • Excellent communication and motivation skills, and ability to interact appropriately with senior-level colleagues and business users.
  • Excellent time management skills, with the ability to prioritize and multi-task, and work under shifting deadlines in a fast-paced environment.
  • Must have legal right to work in the U.S.

Technical Skills:

  • Experience sourcing data using API calls (REST/SOAP) is a plus.
  • Experience working with sourcing and uploading data to and from cloud systems like AWS/GCP is a plus.
  • Strong Teradata experience desired; Oracle, MS SQL Server, Vertica, Netezza, etc. can be alternates. Comprehensive understanding of data warehouse design principles.
  • Ability to proactively identify, troubleshoot and resolve complex data integrity issues.
  • Subscription-based industry experience preferred.
  • Knowledge and understanding of REST/SOAP protocols desirable.
  • Understanding of data integration between on-premise and cloud-based data systems.
  • Understanding of big data technologies and tools like Hadoop, HDFS, Hive, Spark, Flink, Kafka, Sqoop.
  • Thorough knowledge of MS-Office Suite (Word, Excel, PowerPoint, Access).

Position Summary:

The ETL Developer will work closely with the Product Owner, Scrum Master, ETL developers and various business stakeholders to support and ensure quality of delivery for the client's data extraction, transformation and load needs, through the development of new routines, the enhancement of existing ones, or the management of contractor-sourced resources. The role will also implement quality control and audit practices to ensure continuous improvement of the client's assets.

Duties and Responsibilities:

  • Work with various business stakeholders to identify their information needs.
  • Work with data architecture team to understand the source and target data models.
  • Work closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs.
  • Develop and be a proponent for Data Warehouse development standards, including ETL standards and best practices.
  • Provide documentation for all developed ETL processes, including process flow and source and target mapping.
  • Conduct ETL performance tuning and troubleshooting.
  • Work with business intelligence operations team to ensure control objectives are implemented for all data movement processes, implement change control for production data, establish and follow proper incident management procedures.
  • Ensure compliance with all applicable data privacy regulations and policies as they relate to both firm and client/contact data.

Supervisory Responsibilities:

  • None.

Digital Apps Engineers for NC and NY Locations in USA


Please send resumes to: am@truceaid.com

Digital engineers hiring model, top priority. Location: Cary, NC (primary) and NY. High-end developer; the underlined skills are very important.

Any visa is fine as long as they meet the criteria

F2F preferred, but can do Skype; final round via video call with online coding.

Need 10 high-end engineers: Digital Apps Engineers for the Cash Manager digitization requirement (Java Spring, Kafka, RESTful services, Cassandra).

Online test is a must for all candidates.

    • Tech skills: high-end engineering skills in Java, Spring Boot, microservices, Lambda architecture, Kafka, data pipes, Cassandra, MongoDB
    • Experience in large digital transformations and proven experience implementing complex transactional systems (e.g. banking, payments, e-commerce, insurance)
    • Experience in microservices and REST API design and implementation
    • Experience in data APIs
    • Hands-on experience in Spring Boot, microservices, Kafka, the Cassandra NoSQL database, and Hadoop
    • Must have experience in different frameworks: Spring / Hibernate / ORM technologies
    • Knowledge/experience in the following areas is essential: SQL, stored procedures, XML, Drools (rules engine), JSP, Servlet, WebSphere, Tomcat, SOAP/REST web services, Java/J2EE security, Spark/Scala, Kafka messaging
    • Java 8 and Drools experience
    • Experience with open-source BPM tools (especially tools like jBPM, Activiti, etc.)
    • Experience implementing microservices-based applications and deploying to PaaS environments (OpenShift)
    • Knowledge of different Agile processes

 

Need Angular Developer in Minneapolis, MN, USA


Please send your resumes to: am@truceaid.com 

I am currently working on a contract role in Minneapolis for an Angular Developer. They are looking for Kafka and NoSQL experience, along with Angular front-end experience.

This is a 2-month contract, as they are putting a scrum team in place and need supplemental development while they build the team. Since this is shorter term, I am really looking for candidates local to MN or nearby in the Midwest. There is a chance of extension, but likely only for 2-3 months, as they are building the team with full-time resources.

 I can work with all visas except STEM OPT.

Looking for mid-level developers.

Need Hadoop Developer in Plymouth, MN


Please share your resume with: am@truceaid.com

Need EADs (H4/L2/OPT), H1, GC, USC

We currently have an opportunity for a Hadoop Developer. This is a 12-month contract opportunity, with possible extension, in Plymouth, MN. I have included a brief description below of what the client is seeking:

Skype hire

Requirements

· Extensive experience as a Hadoop Developer.

· Experience with Sqoop, Kafka, Spark, Pig, and Hive.

· Experience with shell scripts.

· Strong in SQL (Hive).

Digital apps Engineer for Cash Manager Digitization requirement ( Java Spring, Kafka, RESTful services, Cassandra) for NY/NC, USA


Please share your resume with: am@truceaid.com

Need H4/L2/H1B/GC/USC for this role….

Location : Cary, NC (Primary) and NY, USA

F2F preferred, but can do Skype; final round via video call with online coding.

Need 6 high-end Digital Apps Engineers for the Cash Manager digitization requirement (Java Spring, Kafka, RESTful services, Cassandra).

Online test is a must for all candidates.

Technical requirements are the same as the Digital Apps Engineer posting above.