Looking for Hadoop Developer for Strongsville OH, Pittsburgh PA OR Remote Locations

Indotronix International Corporation

Strongsville, Ohio

United States

Information Technology
(No Timezone Provided)

Overview

Position: Hadoop Developer
Duration: Long term
Location: Strongsville OH, Pittsburgh PA, or Remote

Business Need for this Role: Run the bank. Support a couple of different mnemonics. Troubleshoot any problems with the applications. This is a new addition to the team. Process improvements and enhancements.

Top Must-Have Skills / Candidate Requirements:
- Hadoop: Hive, Sqoop, Oozie, Kafka, Spark (Python and Scala)
- Spark
- Cloudera
- ETL Informatica (nice to have)

Soft Skills: A support mindset

Degrees/Certifications/Years of Overall Experience Required:
- ITIL Foundations (nice to have)
- 6-8 years of experience

Day to Day Responsibilities:
- Analysis, design, development, and support of the Hadoop infrastructure using Hive, Sqoop, Oozie, Kafka, and Spark (Python and Scala).
- Experience working with the Cloudera distribution on multi-node clusters running Spark on YARN.
- Hands-on experience using Hive to extract, transform, and load (ETL) data into a reportable format in a Spark environment (a batch sketch appears below).
- Experience importing and exporting gigabytes of data between HDFS and a relational Teradata database using Sqoop.
- HBase database skills.
- Experience using the Parquet file format in Hive and Spark.
- A clear understanding of the Hadoop architecture and its components: ResourceManager, NodeManager, NameNode, DataNode, HDFS, etc.
- Programming knowledge to read data from Kafka using PySpark and load it into the Hadoop distributed file system and other downstream applications (a streaming sketch follows this list).
- Hands-on experience designing ETL in Informatica PowerCenter 9.1.
- Experience with data warehousing concepts such as dimensional modeling (SCD Type 1, SCD Type 2, etc.), different schemas (star, snowflake, etc.), and denormalization.
- Proficient in data analysis, data modeling (physical and logical) including ER diagrams and data flow diagrams (DFDs), and database design.
- Experience working on production support projects consisting of incident investigation, tracking regular data-loading jobs, resolving tickets, etc.
- Proficient in performance tuning through execution plans, hints, bulk binding, pipelined functions, partitions, indexes, etc.
- Experience with UNIX shell programming.
- PySpark and Scala programming experience.
- Experience with Core Java, AngularJS, JSON, Tableau, CA7, and ServiceNow is preferred but not required.
- Day-to-day development, support, and monitoring of our production environments.
- Leading troubleshooting calls and driving to root-cause resolution; making recommendations for long-term resolution of problems across the enterprise.
- Providing systems support in and out of office hours.
- Assisting our business partners with customer issues when needed.
- Measuring and optimizing application performance.
- Configuring, tuning, and troubleshooting systems to achieve optimal application performance and stability.
- Collaborating effectively with development, testing, and management teams in disparate locations.
- Optimizing the reliability and performance of our software solutions.
- Designing and implementing proactive monitoring to ensure the health, performance, and security of our production environment.
- Maintaining configuration documentation.
- A self-starter with an inquiring and open mind, detail-oriented, and capable of stepping back and seeing the bigger picture.
- Demonstrated ability to collaborate and communicate clearly and effectively with people from both technical and non-technical areas, including client-facing communications and the creation of formal documentation.
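As an illustration of the Kafka-to-HDFS ingestion named in the responsibilities, the following is a minimal PySpark Structured Streaming sketch. The broker address, topic name, message schema, and HDFS paths are placeholders rather than details of this role, and the job assumes the Spark-Kafka connector package is available on the cluster.

```python
# Minimal sketch: consume a Kafka topic with PySpark Structured Streaming and
# land the records in HDFS as Parquet. All names below are hypothetical.
# Requires the Kafka connector on the classpath, e.g.
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark version> ...
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (SparkSession.builder
         .appName("kafka-to-hdfs-ingest")   # hypothetical job name
         .getOrCreate())

# Assumed message layout; the real schema would come from the upstream producer.
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
       .option("subscribe", "transactions")                # placeholder topic
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers key/value as binary; cast the value and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/raw/transactions")           # placeholder HDFS path
         .option("checkpointLocation", "hdfs:///chk/transactions")  # needed for recovery
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```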
Additional Qualifications:
- A strong understanding of how technology relates to the business, market, and industry, and the ability to apply that knowledge to support the needs of the business.
- Experience developing and/or debugging application code and installing third-party vendor software.
- Solid experience with Informatica, Oracle, Teradata, Unix, ETL, Power Exchange, PL/SQL, uDeploy, data analysis, and Hadoop.
- Previous experience with specific enterprise computing tools: ServiceNow for Incident, Problem, and Change; Connect:Direct (NDM) file transmission.
- 6+ years of technical experience; 3+ years in a production support role preferred.

Interview Process: Video Interview
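For the Hive-to-reportable-format ETL called out in the responsibilities, a minimal batch PySpark sketch is shown below. The database, table, column, and output path names are illustrative assumptions, not artifacts of this position.

```python
# Minimal sketch: read a Hive table through Spark SQL, reshape it, and publish
# it as Parquet for reporting. All names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("hive-etl-report")   # hypothetical job name
         .enableHiveSupport()          # lets Spark read the Hive metastore
         .getOrCreate())

# Extract: pull the source rows from a Hive table.
txns = spark.sql("SELECT account_id, amount, txn_date FROM staging.transactions")

# Transform: aggregate into a reporting-friendly shape.
daily = (txns.groupBy("account_id", "txn_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("txn_count")))

# Load: write Parquet partitioned by date so downstream reports can prune partitions.
(daily.write
 .mode("overwrite")
 .partitionBy("txn_date")
 .parquet("hdfs:///data/reporting/daily_account_totals"))  # placeholder HDFS path
```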
