Have sound exposure to Retail …

- Other scripting skills, such as Python or Perl, a plus
- Expert knowledge of large-scale/distributed SQL, Hadoop, NoSQL, HBase, and columnar databases
- Expert knowledge of Hadoop-related technologies: Hive, Impala, MapReduce, Spark, etc.
- Use hands-on programming skills to design, code, and test complex distributed application components
- Will work with a team of technology and business data specialists to execute the technology functions required to establish data environments, develop data maps, extract and transform data, and analyze and reconcile data errors and anomalies (see the ETL sketch after this section)

Hadoop Developer is one of the most highly paid profiles in the current IT industry. Fully remote role for at least the first quarter of 2021.

Headline: Senior Hadoop Developer with 8+ years of experience in analysis, design, development, and implementation of large-scale web-based applications using Big Data, Hadoop, Spark, NiFi, MapReduce, Storm, Hive, HBase, Core Java, J2EE, and related technologies. Experience and strong knowledge of Apache Spark.

Hadoop Developer resumes, templates, and sections. The specific duties mentioned on a Hadoop Developer resume include the following: undertaking Hadoop development and implementation; loading from disparate data sets; pre-processing using Pig and Hive; designing, configuring, and supporting Hadoop; translating complex functional and technical requirements; performing analysis of vast data; managing and deploying HBase; and …

For example, if you have a Ph.D. in Neuroscience and a Master's in the same sphere, just list your Ph.D. Hadoop sits at the frontier of IT technologies because it supports so many business processes, and the same can be said of the person who takes care of your data storage needs. Identifies tasks and issues that may have an impact on service levels or schedules.
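As a rough illustration of the extract-and-transform duty listed above, here is a minimal PySpark sketch. The file path, table name, and column names are hypothetical, and a real pipeline would add proper anomaly reporting:

    # etl_sketch.py -- illustrative only; paths and schema are made up
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("retail-etl-sketch")
             .enableHiveSupport()  # allows writing curated data to Hive
             .getOrCreate())

    # Extract: load a raw file landed on HDFS
    raw = spark.read.csv("hdfs:///landing/retail/sales.csv",
                         header=True, inferSchema=True)

    # Transform: reconcile obvious errors and anomalies
    clean = (raw
             .dropDuplicates(["order_id"])   # remove double-loaded rows
             .filter(F.col("amount") >= 0)   # drop negative-amount anomalies
             .withColumn("order_date",
                         F.to_date("order_date", "yyyy-MM-dd")))

    # Load: persist a curated Hive table for downstream analysis
    clean.write.mode("overwrite").saveAsTable("curated.retail_sales")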
- Knowledge of Scrum and Agile methodologies
- Third-party product integration (installation, configuration, and customization)
- System and data analysis, architecture, and design
- Software development, configuration, and customization
- Work with Project Managers, Business Analysts, business clients, and other technical teams to design, implement, test, and maintain back-end tax processing solutions
- Define new product features in conjunction with product management, and provide specifications
- SQL development and data modelling, ideally in a Sybase or MS SQL environment
- ETL applications such as Informatica or Pentaho
- Database reporting applications, such as Business Objects
- Programming experience with MapReduce/HDFS/Pig/Hive/HBase is desirable (see the streaming sketch after this list)
- 1+ years of experience in one or more of the following EAI concepts and tools:
  - Messaging protocols such as TIBCO or MQ-Series
  - ETL tools: Informatica, DataStage, Attunity, Ab Initio, Teradata utilities
  - TCP/IP socket-based protocols and network programming
  - J2EE, .NET (networking, physical design, performance, programming)
  - Application servers: JBoss, WebLogic, WebSphere
- 1+ years of applications development and architecture utilizing the following: Java, C, C++, C#, Perl, PHP, Python, UNIX shell, SQL, HTML, etc.
- 1+ years of experience in one or more of the following operating systems and databases:
  - Operating systems: Windows, Linux, UNIX, and MVS (two or more required)
  - Databases: Teradata, DB2, PostgreSQL, MySQL, Oracle (one or more required)
- Excellent technical and organizational skills
- Minimum 7+ years of IT experience; development on a big data platform is a must
- Java is preferable; should also have UI experience
- A flair for data, schemas, and data models, and for bringing efficiency to the big data life cycle
- Develop solutions for key business initiatives, ensuring alignment with the future-state analytics vision
- Engage constructively with project teams to support project objectives through the application of sound architectural principles
- Develop and validate that the proposed solution architecture supports the stated and implied business requirements of the project
- Utilize statistical, mathematical, and predictive-modeling skills, as well as business strategy, to build the algorithms necessary to ask the right questions and find the right answers
- Design experiments, test hypotheses, and build models
- Develop solutions to loosely defined business problems by leveraging pattern detection over potentially large datasets
- Propose innovative ways to look at problems by using data mining
- Validate findings using an experimental and iterative approach
- Communicate and present findings, orally and visually, in a way that can be easily understood by business counterparts
- Summarize and present findings; adjust the content and level of detail to the audience
- Use strong business and problem-solving skills and programming knowledge to quickly cycle hypotheses through discovery
- Work closely with clients, clinicians, data stewards, and other business and technology leaders to frame problem definitions and potential solutions
- 1+ year of Hadoop environment development experience
- Experience working on big data tools (for example: Hadoop, Hive, Spark, Pig, Splunk, etc.)

Hadoop Explained – learn how Hadoop works and what it does in this book, which also offers tips for successfully using Hadoop to manage large amounts of data.
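For the MapReduce item above, a minimal Hadoop Streaming word count is the canonical starter exercise. This is a sketch only: it assumes Hadoop Streaming is available and that input arrives line by line on stdin; the file names are hypothetical.

    # wordcount_mapper.py -- mapper for Hadoop Streaming (illustrative)
    import sys

    for line in sys.stdin:
        for word in line.split():
            # Emit "word<TAB>1"; the framework sorts by key before reducing.
            print(f"{word}\t1")

    # wordcount_reducer.py -- reducer for Hadoop Streaming (illustrative)
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")  # flush the previous key
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")  # flush the final key

Because Streaming jobs are just stdin/stdout filters, the pair can be smoke-tested locally with cat input.txt | python wordcount_mapper.py | sort | python wordcount_reducer.py before submitting through the hadoop-streaming jar.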
Experience programming with various components of the framework, such as Impala. However, more complex assignments may require closer supervision and assistance.

- Designs, codes, tests, and debugs programs of varying degrees of complexity
- Works directly on application/technical problem identification and resolution, including off-shift and weekend support functions
- Works independently on complex modules that may be used by one or more programs or applications
- Works independently on complex projects that may span multiple infrastructure components
- Under the direction of more senior staff, assists in the development of major application modules and programs
- Participates in integrated testing and user acceptance of application or infrastructure components
- Works with vendors on the integration of purchased application and/or infrastructure solutions
- Assists less senior staff with logic problems and interpretation of specifications
- Fully knowledgeable of programming languages appropriate to the platform supported, program design and specification development, programming logic, logic diagrams, basic system analysis techniques, testing, debugging, documentation standards, file design, storage, and internal systems
- Designs and implements processes, or process improvements, to aid in development and support
- Bachelor's degree in Computer Science, Information Technology, M.I.S., Analytics, Statistics, Mathematics, or a Computer Technology-related field
- 2+ years of application development or support experience
- 1+ year of experience with relational databases
- Knowledgeable of the latest technology in programming languages, computing hardware and software, and current development processes and tools
- 2 to 4 years of programming, integration, or infrastructure experience
- Knowledge of Caterpillar policies and procedures, and a general understanding of Caterpillar's organization
- Flexibility in adapting to team requirements
- Ensuring adherence to defined process and quality standards and best practices
- Ensuring high quality levels in all deliverables
- Adhere to the team's governing principles and policies
- Participate in mandatory training and cross-functions across the team
- Demonstrate enthusiasm and zeal to acquire domain knowledge
- Participate and contribute in team/project reviews
- Actively involved in process improvements and automation
- 2+ years of experience in the design and implementation of big data applications using Hadoop technologies such as Spark SQL, Spark DataFrames, Hive, Sqoop, Oozie, and Kudu
- Hands-on experience with Python and Scala is nice to have
- 3 years of working experience in Java/J2EE and object-oriented programming
- Experience working with open-source frameworks
- Experience with job scheduling tools like Autosys
- Experience in the banking or financial services domain
- Participation in meetings with the customer
- Experience/knowledge of Hadoop generally, and specifically HDFS and Hive table definitions
- Knowledge of database principles such as denormalization, data types, and data integrity; knowledge of Pentaho or Informatica (ideal)
- Understanding of logical data modelling (LDM) and mapping data to LDMs
- 6 or more years of relevant industry experience
- Ability to deploy and maintain a multi-node Hadoop cluster
- Extracting and exporting data using Sqoop from sources such as SAP HANA, AS400, Oracle, and SQL Server (see the pipeline sketch at the end of this section)
- Developing data pipelines using Hive and Impala
- Shell scripting experience to manage the data pipelines
- A basic understanding of security concepts on a Hadoop platform: Kerberos, Sentry, AD groups, etc.
- Kafka is good to have, since we have some upcoming use cases
- Knowledgeable in techniques for designing Hadoop-based file layouts optimized to meet business needs
- Experience with NoSQL databases: HBase, Apache Cassandra, Vertica, or MongoDB
- Able to translate business requirements into logical and physical file structure designs
- Ability to build and test MapReduce code in a rapid, iterative manner
- Web development: HTML5, CSS3, JavaScript, JSON, XML
- J2EE technologies: JSP/Servlets, EJB3, JMS, JDBC, JMX
- Web services: SOA, XML, XSL, SOAP, REST, Spring MVC, Spring Boot
- Application development: Python, Java, Ruby
- Data-layer development: MySQL, PostgreSQL, NoSQL
- Hadoop: HDFS, Hive, Spark, HBase, Impala, Apex
- Sound understanding of continuous integration and continuous deployment environments
- Solid understanding of application program interfaces (APIs), messaging software, and interoperability techniques and standards
- Strong analytical skills with a passion for testing
- A history of open-source contribution is a plus
- Bachelor's degree in Computer Science or Information Systems, or equivalent practical experience

Be knowledgeable in all NGC HIPAA compliance requirements and proactively address any HIPAA concerns.

Profile: Hadoop Stack Developer and Administrator. "Transforming large, unruly data sets into competitive advantages." Purveyor of competitive …

Completed integration testing and tracked and resolved defects. Make sure that you are inputting all the necessary information, be it your professional experience, educational background, certifications, etc. Let us look at the rise in salary trends for Hadoop developers compared to other profiles. Become knowledgeable on the HIPAA policies and procedures for the program and ensure awareness of the HIPAA breach process.

- Bachelor's degree in computer science or a related field and 15 years of related work experience
- Master's degree and 13 years of experience, or 19 years of related work experience to satisfy the degree requirement
- 2+ years as a Hadoop Developer with hands-on Hadoop ecosystem experience: Java MapReduce, Pig, Hive, Spark, and Impala
- 3 years of proven experience working with, processing, and managing large data sets (multi-TB scale)
- 3 years of proven experience in ETL (Syncsort DMX-h, Ab Initio, IBM InfoSphere Data Replication, etc.)
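For the Sqoop and Hive/Impala pipeline items above, here is a hedged sketch of the same ingest-then-transform flow. Sqoop itself is a command-line tool, so this sketch substitutes Spark's built-in JDBC reader for the ingest step; every connection detail, table name, and column here is hypothetical:

    # pipeline_sketch.py -- hypothetical JDBC-to-Hive pipeline in PySpark
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("jdbc-to-hive-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Ingest: pull a table from a relational source over JDBC
    # (the role would use Sqoop here; URL and credentials are made up)
    orders = (spark.read.format("jdbc")
              .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
              .option("dbtable", "SALES.ORDERS")
              .option("user", "etl_user")
              .option("password", "***")
              .load())

    # Transform: an aggregation expressed in Spark SQL
    orders.createOrReplaceTempView("orders")
    daily = spark.sql("""
        SELECT order_date,
               COUNT(*)    AS order_count,
               SUM(amount) AS revenue
        FROM orders
        GROUP BY order_date
    """)

    # Load: write a Hive table that Impala can also query
    daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")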