BigData Architect for Delaware, USA

Send resume to: am@truceaid.com

Any visa is fine as long as it meets the criteria below.

We need someone with 14-15 years of experience, with the last 5 years working as a Big Data Architect with Hive, HDFS, HBase, YARN, Spark, and Sentry or Ranger experience. The client is migrating their tenants' applications from the Cloudera distribution to Hortonworks, so experience with those migration nuances is good to have.

Location: Delaware

Big Data Architects JD: 1 Position

  • 12+ years of hands-on experience in a variety of platform and data development and architecture roles
  • 5+ years of experience in big data technology, ranging from platform architecture and data management to data architecture and application architecture
  • Expert in designing and implementing solutions for multi-tenancy, with the ability to drive automation
  • Good understanding of the current industry landscape and trends in tools for ingestion, processing, consumption, and security management
  • 2+ years of experience working with data warehouse solutions available in AWS
  • Proficiency with the Hadoop platform, including Kafka, Spark, HBase, Impala, Hive, and HDFS, in multi-tenant environments
  • Solid base in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience in metadata management and data quality processes and tools
  • Experience in full-lifecycle architecture guidance
  • Advanced analytical thinking and problem-solving skills

Long-term contract
