SECURA Insurance

  • Big Data Engineer

    Location: US-WI-Appleton
    Posted Date: 3 weeks ago (4/3/2018 8:57 PM)
    Job ID:
    Job Grade: Grade 8
    Category: Information Technology
  • Overview

    The Big Data Engineer is responsible for supporting the analysis, design, and implementation of SECURA’s big data solutions. In addition, you are responsible for coaching and mentoring our current Business Intelligence staff in understanding where big data solutions can and should be used.


    You will work with SECURA’s business leaders and technology teams to deliver solutions that support our Actuarial, Underwriting, and Claims staff. You will perform detailed analysis of business problems and technical environments, and use that analysis to craft solutions that meet business needs while adhering to best-practice architecture.


  • Responsibilities

    • Effectively learn, analyze, communicate, and implement big data concepts and technologies such as Hadoop, HDFS, Pig, Hive, Spark, NoSQL, etc.
    • Lead cross-functional business and technology stakeholders to elicit, analyze, document, and validate business and technical requirements
    • Design and develop code, scripts and data pipelines that leverage structured and unstructured data
    • Develop and recommend innovative, practical approaches to solving business and technical problems
    • Lead the analysis of the current technology environment to detect critical deficiencies and recommend solutions for improvement
    • Build robust and scalable data infrastructure (both batch processing and real-time) to support needs from internal users
    • Implement measures to address data privacy, security and compliance
    • Mentor other engineers on the team regarding technology and best practices
    • Collaborate with cross-functional teams to utilize the new big data tools
    • Design, develop, and operationalize ETL/ELT programs to support the data needs of cross-functional projects
    • Develop, manage, and maintain an enterprise big data strategy and roadmap
    • Lead the development and implementation of solutions using agile methodology
    • Coordinate with enterprise architects to align with roadmaps and to understand the impact on the enterprise architecture
    • Participate in training and sharing technical knowledge with team members


  • Qualifications

    To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.



    Required:

    • Bachelor’s degree in Computer Science or a related field, or equivalent work experience
    • 8+ years of IT experience
    • 5+ years implementing Java-based solutions
    • 2+ years implementing big data solutions
    • Proficient in big data technologies, including Apache Spark, Atlas, Storm, Hive, Pig, and HBase
    • Familiar with data modeling, data architecture, and data governance concepts
    • Proficient in database concepts and technologies, including SQL Server and/or DB2


    Preferred:

    • 3 years working in a P&C insurance company
    • HCA, HDPCD, HDPCA, or another relevant big data certification

