Any visa is fine.
Send resume to: email@example.com
Data Architect / Modeler (6 positions)
We need to fill six positions (two senior, 10-15 years' experience, and four junior, 6-10 years' experience), all based out of the Delaware office. To keep this simple, please focus on the following bullets for candidate selection; I have provided more detail in the parentheses for each bullet item. Anyone with less than 5 years of experience will not be considered for the Data Architect / Modeler role.
- 5-10 years of experience using data modeling tools, specifically ERwin (this is a must, as the client uses this tool)
- Experience with large-scale data warehousing environments (this is also a must, so find candidates with multiple EDW deployment experience where they have played the role of Data Architect / Data Modeler)
- Experience with/understanding of business process reengineering and business modeling concepts, business systems development, and analysis (the candidate should have done this directly with business folks / power users)
- Experience with Teradata; experience with Hadoop, Cassandra, and Hive is a plus (find candidates who have designed database models for Teradata, big data (Hive), and NoSQL databases such as Cassandra, MongoDB, CouchDB, or HBase; a brief Hive sketch follows this list)
- Experience with change management procedures (more from a data model and model remediation perspective)
- Experience with Data Governance (experience with Tools such as Collibra, Apache Atlas, IBM DGC, etc.)
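To help screeners gauge the modeling depth expected, here is a minimal, hypothetical sketch of the kind of Hive DDL such a candidate might have issued for an EDW-style star schema. The database, table, and column names are illustrative only; a real model would come out of ERwin.

    # Minimal sketch: issuing Hive DDL for a simple star-schema fact table via PySpark.
    # The database, table, and column names below are hypothetical.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("edw_model_sketch")
             .enableHiveSupport()
             .getOrCreate())

    spark.sql("""
        CREATE TABLE IF NOT EXISTS edw.fact_sales (
            sale_id      BIGINT,
            customer_key BIGINT,
            product_key  BIGINT,
            sale_amount  DECIMAL(18,2)
        )
        PARTITIONED BY (sale_date DATE)
        STORED AS PARQUET
    """)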
Send resumes to: firstname.lastname@example.org
Need resources with status: EADs(H4/L2), GC, USC only
SAS Reports Developer
Tampa, FL (non-local OK)
Phone then Skype
- 5+ years SAS reporting experience
- SQL (must be able to write queries from scratch; a short example of the expected level follows the Technical Skills list below)
- Healthcare experience (healthcare data, member or provider side, commercial insurance or Medicare/Medicaid claims data), e.g., BCBS/Aetna/United
- Experience with other reporting tools
- Proficient in developing SAS programs and macros, including advanced SAS programming techniques for data access, data cleansing, data management, and reporting
- Experienced in efficiently extracting data from Oracle, SQL Server, Greenplum, and SPDS using SAS/ACCESS
- Experienced in debugging SAS code, resolving compilation errors, and reviewing SAS code
- Experience creating Excel reports using SAS/ODS and automating report processes
- Experienced in SDLC methodologies
- Monitoring production jobs and handling job failures to keep all reporting processes up and running
- Flexibility in resolving critical report issues for business/market/state stakeholders to meet stringent deadlines
Technical Skills:
- Base SAS 9.3, SAS macros, SAS/ACCESS, SAS/ODS; strong in SQL
- Good experience with Unix and basic shell scripting
- Knowledge of VB scripting would be an added advantage
- SAS/SPDS experience would be a plus
- Version control tools like Team Foundation Server
- Exposure to Autosys
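As a rough screening aid for the "write queries from scratch" requirement, here is a minimal sketch of the level expected, shown in Python for brevity even though the role itself is SAS-centric. The ODBC data source, table, and column names are hypothetical.

    # Minimal sketch: a from-scratch aggregation over a hypothetical claims table,
    # run through ODBC. The DSN, table, and columns are placeholders, not real systems.
    import pyodbc

    conn = pyodbc.connect("DSN=claims_dw")  # hypothetical ODBC data source
    sql = """
        SELECT member_id,
               COUNT(*)         AS claim_count,
               SUM(paid_amount) AS total_paid
        FROM   medicaid_claims
        WHERE  service_date >= '2019-01-01'
        GROUP BY member_id
        HAVING SUM(paid_amount) > 10000
        ORDER BY total_paid DESC
    """
    for member_id, claim_count, total_paid in conn.cursor().execute(sql):
        print(member_id, claim_count, total_paid)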
Send resumes to: email@example.com
Big Data Developer
7+ month contract
No OPT/CPT/H1; need only EADs(H4/L2), GC, and USC
Qualifications:
- Graduate degree in Computer Science, Information Systems, or an equivalent quantitative field, and 5+ years of experience in a reporting and analytics role
- Experience working with and extracting value from large, disconnected, and/or unstructured datasets
- Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency, and workload management
- Strong interpersonal skills and the ability to project manage and work with cross-functional teams
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases, Hadoop, Pig, Datameer, or other big data tools
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets (a brief Spark sketch follows this list)
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Experience with the following tools and technologies: Hadoop, Spark, Kafka, Pig, Datameer, and relational SQL and NoSQL databases
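To illustrate the pipeline-building bullet above, here is a minimal sketch of a Spark ETL job. The paths, schema, and column names are hypothetical, and the real stack could equally involve Pig or Datameer.

    # Minimal sketch of a Spark ETL job: read raw semi-structured events, clean and
    # transform them, and write a partitioned, query-ready data set. All paths and
    # column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events_etl_sketch").getOrCreate()

    raw = spark.read.json("hdfs:///data/raw/events/")         # semi-structured input

    clean = (raw
             .filter(F.col("event_type").isNotNull())         # drop malformed records
             .withColumn("event_date", F.to_date("event_ts")) # derive partition column
             .dropDuplicates(["event_id"]))

    (clean.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("hdfs:///data/curated/events/"))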
Send resume to: firstname.lastname@example.org
Need EADs(H4/L2), GC, USC
BI ETL Developer
NY, NY (final F2F required, but will consider non-local candidates as long as they will attend the F2F)
Phone/F2F (will Skype non-local candidates before the F2F; client will assist with the cost of the F2F)
- Bachelor's degree or higher, or equivalent relevant experience.
- Minimum of 5 years of experience in ETL development in business intelligence/data integration projects.
- 5+ years of experience working with Informatica, GoldenGate, etc.
- 5+ years of experience working with Relational Databases like Teradata, Vertica, Oracle, SQL Server, etc.
- 3+ years of experience working with Unix Shell Scripts.
- Experience writing complex SQL to analyze and troubleshoot issues.
- Strong data analysis and troubleshooting skills.
- Experience in integrating technologies with internal and external data sources.
- Goal oriented and creative personality with good interpersonal skills.
- Ability to manage multiple assignments simultaneously and follow up on unfinished business.
- Ability to work independently and in a team environment.
- Ability to pay attention to the details and be organized.
- Commitment to internal client and customer service principles.
- Willingness to take initiative and to follow through on projects.
- Excellent communication and motivation skills, and ability to interact appropriately with senior-level colleagues and business users.
- Excellent time management skills, with the ability to prioritize and multi-task, and work under shifting deadlines in a fast-paced environment.
- Must have legal right to work in the U.S.
- Experience sourcing data using API calls (REST/SOAP) is a plus.
- Experience sourcing and uploading data to and from cloud systems like AWS/GCP is a plus (a brief sketch of both follows this list).
- Strong Teradata skills desired; Oracle, MS SQL Server, Vertica, Netezza, etc. can be alternatives. Comprehensive understanding of data warehouse design principles.
- Ability to proactively identify, troubleshoot and resolve complex data integrity issues.
- Subscription-based industry experience preferred.
- Knowledge and understanding of REST/SOAP protocols desirable.
- Understanding of data integration between on-premise and cloud-based data systems.
- Understanding of big data technologies and tools like Hadoop, HDFS, Hive, Spark, Flink, Kafka, and Sqoop.
- Thorough knowledge of MS-Office Suite (Word, Excel, PowerPoint, Access).
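For the two "plus" items on REST sourcing and cloud loading, here is a minimal, hypothetical sketch of the pattern. The endpoint, bucket, and key names are made up for illustration.

    # Minimal sketch: pull one page of records from a REST API and stage the payload
    # in S3 for downstream ETL. The endpoint, bucket, and key are hypothetical.
    import json
    import boto3
    import requests

    resp = requests.get("https://api.example.com/v1/subscriptions",
                        params={"page": 1}, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    s3 = boto3.client("s3")
    s3.put_object(Bucket="etl-landing-zone",
                  Key="subscriptions/page_1.json",
                  Body=json.dumps(records).encode("utf-8"))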
The ETL Developer will work closely with the Product Owner, Scrum Master, other ETL developers, and various business stakeholders to support and deliver quality for the client's data extraction, transformation, and load needs, through the development of new routines and/or enhancement of existing ones, or the management of contractor-sourced resources. The developer will also implement quality control and audit practices to ensure continuous improvement of the client's assets.
Duties and Responsibilities:
- Work with various business stakeholders to identify their information needs.
- Work with data architecture team to understand the source and target data models.
- Work closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs.
- Develop and be a proponent for Data Warehouse development standards, including ETL standards and best practices.
- Provide documentation for all developed ETL processes, including process flow and source and target mapping.
- Conduct ETL performance tuning and troubleshooting.
- Work with the business intelligence operations team to ensure control objectives are implemented for all data movement processes, implement change control for production data, and establish and follow proper incident management procedures.
- Ensure compliance with all applicable data privacy regulations and policies as they relate to both firm and client/contact data.
Please send resumes to: email@example.com
Digital Engineers hiring model - TOP priority. Location: Cary, NC (primary) and NY. High-end developers; the underlined skills are very important.
Any visa is fine as long as they meet the criteria
F2F preferred, but can do Skype; the final round will be a video call with online coding.
Need 10 high-end engineers - Digital Apps Engineers for a Cash Manager digitization requirement (Java Spring, Kafka, RESTful services, Cassandra).
Online test is a must for all candidates.
- Tech skills: high-end engineering skills in Java, Spring Boot, microservices, Lambda architecture, Kafka, data pipes, Cassandra, MongoDB
- Experience in large digital transformations and proven experience implementing complex transactional systems (e.g., banking, payments, e-commerce, insurance)
- Experience in Microservices and REST API design and implementation
- Experience with data APIs
- Hands-on experience with Spring Boot, microservices, Kafka, Cassandra (NoSQL), and Hadoop (a brief Kafka sketch follows this list)
- Must have experience in different frameworks – Spring / Hibernate / ORM technologies
- Knowledge/experience in the following areas is essential: SQL, stored procedures, XML, Drools (rules engine), JSP, servlets, WebSphere, Tomcat, SOAP/REST web services, Java/J2EE security, Spark/Scala, Kafka messaging
- Java 8 & Drools experience
- Experience with open source BPM Tools (especially tools like jBPM, Activiti, etc.)
- Experience in implementing Microservices based applications and deployment to PaaS environments (Openshift)
- Knowledge of different Agile processes
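The stack for this role is Java/Spring, but as a language-neutral illustration of the Kafka "data pipe" pattern the bullets keep referencing, here is a minimal producer sketch using the kafka-python client. The broker address, topic, and payload are hypothetical; a candidate would normally do this with Spring Kafka.

    # Minimal sketch: publish a payment event onto a Kafka topic. Broker, topic, and
    # payload fields are hypothetical; in the actual role this would be Spring Kafka.
    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    producer.send("payments.events",
                  {"payment_id": "P-1001", "amount": 250.00, "status": "AUTHORIZED"})
    producer.flush()  # block until buffered messages are delivered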
Need EADs(H4/L2), GC, USC
Please send your resume to: firstname.lastname@example.org
Need a Solution Architect with the below skills
Solution Architect (1 open) - for use with the following technology skill sets: Java, Angular, UI/UX, HTML, CSS, ASP, .NET, SQL Server, MySQL, Windows, Linux, Oracle, JS, UNIX, SharePoint, BI tools, Informatica, ETL, Flash, Docker, Puppet, Bamboo, Jenkins, Tomcat, tc Server, Apache, IIS, Nginx, Hadoop
Please share all resumes: email@example.com
Need EADs(H4/L2/OPTs), H1, GC, USC
We currently have an opportunity for a Hadoop Developer. This is a 12-month contract opportunity, with possible extension, in Plymouth, MN. I have included a brief description below of what the client is seeking:
· Extensive experience as a Hadoop developer
· Experience with Sqoop, Kafka, Spark, Pig, and Hive
· Experience with shell scripts
· Strong in SQL (Hive); a brief ingest-and-query sketch follows this list
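As a quick, hypothetical illustration of the ingest-and-query loop above, here is a minimal Spark sketch that pulls a relational table into Hadoop (the job Sqoop typically does) and exposes it as a Hive table. The connection details and table names are placeholders.

    # Minimal sketch: land a (hypothetical) RDBMS table in Hadoop via Spark's JDBC
    # reader and save it as a Hive table for downstream SQL. All connection details
    # are placeholders.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("ingest_members_sketch")
             .enableHiveSupport()
             .getOrCreate())

    members = (spark.read.format("jdbc")
               .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
               .option("dbtable", "membership.members")
               .option("user", "etl_user")
               .option("password", "change_me")
               .load())

    members.write.mode("overwrite").saveAsTable("warehouse.members")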
Please send all resumes to: firstname.lastname@example.org
Need EADs(H4/L2/OPTs), GC, USC
For a large web hosting company.
3-5 positions available. Candidates must be based in Boston or be willing to relocate or commute to Boston and Houston, TX.
Analytics/Data Science Engineer:
- 3+ years of expertise in big data technologies. Ideally we're looking for folks with Hadoop and Hive experience, but we can also consider folks who don't have Hadoop experience and instead have experience with other big data technologies like Cassandra or MongoDB.
- 3+ years of expertise in writing SQL queries and ETL/ELT jobs using Spark (Java/Python/R/Scala) to pull and push data into the big data environment.
- 2+ years of experience in machine learning (a brief sketch follows this list)
- Any experience implementing Big Data / Data Lake and microservices integration solutions on clouds such as AWS, Azure, GCP, etc. is a plus.
- Any experience with BI tools like Tableau is a plus.
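For the machine-learning bullet, here is a minimal, hypothetical sketch of the level expected: fit and evaluate a simple classifier on a feature table that an upstream Spark/Hive job would have produced. The file name and columns are made up for illustration.

    # Minimal sketch: train and score a simple classifier on a hypothetical feature
    # table. The parquet file and the "churned" label column are placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    features = pd.read_parquet("customer_features.parquet")
    X = features.drop(columns=["churned"])
    y = features["churned"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))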