Sr Big Data Engineer - Permanent - Remote

Nigel Frank International

Port Washington New York

United States

Information Technology

Sr Big Data Engineer - Permanent - Remote
The Role
The Sr. Big Data Engineer will be the senior person responsible for big data engineering, data wrangling, data analysis and user support, primarily focused on the Cloudera Hadoop platform and in the future extending to the cloud. The Sr. Big Data Engineer must have strong hands-on technical skills and be able to mentor and train other engineers in conventional ETL and SQL as well as data science languages such as Python and R, using big data techniques. The role will also help define and implement the Big Data Strategy for the organization and drive implementation of IT solutions for the business.
Responsibilities:

  • Proactively analyze business needs, profile large data sets and build custom data models and applications to drive business decision making and customer experience
  • Build workflows that empower analysts to efficiently use data
  • Develop and extend design patterns, processes, standards, frameworks, and reusable components for various data engineering functional areas.
  • Perform requirements analysis, planning and forecasting for Hadoop data engineering/ingestion projects
  • Design optimized Hadoop and big data solutions for data ingestion, data processing, data wrangling, and data delivery
  • Design, develop and tune data products, streaming applications, and integrations on large-scale data platforms (Hadoop, Kafka Streaming, HANA, SQL Server, data warehousing, big data, etc.) with an emphasis on performance, reliability, scalability, and most of all quality.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for efficient extraction, transformation, and loading of data from a wide variety of data sources
  • Build data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Develop custom data models and algorithms
  • Identify opportunities for data acquisition
  • Peer-review code developed by team members
  • Recommend ways to streamline processes for efficiency and effectiveness.
  • Execute solutions effectively to meet client and company needs.
  • Work in multi-functional agile teams to continuously experiment, iterate and execute on data-driven product objectives.
  • Identify and resolve day-to-day issues to ensure continuous improvement.
  • Network with colleagues to share knowledge and gain new perspectives (Management/IC track).
Qualifications
  • 7+ years of hands-on experience with big data
  • Bachelor's or advanced degree in a technology field or equivalent experience
Knowledge/Skills/Abilities Required:
  • Strong problem-solving skills with an ability to isolate, deconstruct and resolve complex data engineering challenges
  • Strong object-oriented programming skills
  • Ability to take complex data and communicate in a manner that is understandable to all audiences
  • Understands the evolving landscape of technology and its effect on clients and data products
  • Maximizes profitable growth by seeking efficiency in systems and processes
  • Delivers optimal solutions to meet client and company's needs
  • Ability to communicate effectively with a wide range of audiences, both verbally and in writing
  • Demonstrated success in partnering with cross-functional departments and teams to achieve business objectives.
  • Ability to effectively network with colleagues to share knowledge and gain new perspectives.
  • Strong communication (verbal and written) and client service skills, with effective interpersonal and presentation skills applicable to a wide audience including senior/executive management, clients, and peers.
  • Responds well to a bottom-up management approach, supporting a culture of creativity and innovation.
  • Demonstrated understanding of the evolving landscape of technology.
  • Can effectively communicate business strategy and objectives.
  • Able to strike an effective balance between focus on strategic priorities and near-term operational priorities.
  • Ability to tailor communication style and content to the audience.
  • Knows how businesses work and is aware of how strategies and tactics work in the marketplace.
  • Understands how to translate client information into improved products and services.
Technical Skills
  • Knowledge of programming/scripting languages, including experience in core Python/Java/Scala; must have experience with object-oriented programming concepts and shell scripting
  • Knowledge of design patterns and related skills, including statistics, data visualization and microservices preferred; must have experience with REST APIs, data modelling and performance tuning
  • Knowledge of RDBMS/databases, including experience in ANSI SQL; experience in Oracle/MySQL/Sybase; knowledge of MongoDB (object stores) and Snowflake
  • Experience in job scheduling, including Control-M, crontab, Airflow or Autosys
  • Experience with data libraries, including Python pandas and ETL libraries in Python or Java
  • Must have experience in Hadoop/big data, including HDFS, Spark, Hive/Spark SQL, Sqoop and data warehousing; knowledge of HBase, Phoenix, Datameer (or other analytical tools), Power BI (or another reporting tool, e.g. Tableau) and SAS/Dataflex preferred
  • Experience in cloud software, including Azure/AWS/Google Cloud; must have experience in Data Factory or other cloud ETL tools and cloud data lake storage; knowledge of Databricks; knowledge of Azure Batch, Delta Lake, Azure HDInsight, Cosmos DB, Azure Event Hubs and Synapse preferred
  • Knowledge of back-office full stack, including Servlet, Spring Framework, HTML, JavaScript, CSS, React and Highcharts preferred
  • Must have experience with OS software, including Linux/UNIX
  • Must have experience in project management/agile tooling, including SVN/Git/GitHub; knowledge of Azure DevOps and CI/CD preferred
  • Knowledge of tools, including experience with an IDE (IntelliJ/Eclipse/PyCharm/Visual Studio) and DBeaver/SQL Developer or other SQL development tools; knowledge of Hortonworks/Cloudera (Ambari) and Trifacta preferred
If this role is of interest, please contact Shannon today at
