Data Engineer
Job Summary:
The Data Engineer is responsible for designing, delivering, and supporting data solutions that align with the organization's enterprise information strategy. The Data Engineer will serve as a subject matter expert, bringing a high level of proficiency to the enhancement and support of existing and new Brandywine Global Investment Management Enterprise Data Hub components, including: design of the hub data structures, design and coding of the data loaders, and design and deployment of data integration tools. The Data Engineer will work both independently and with a team on tasks and projects. Additionally, they are responsible for delivering high-quality business solutions within aggressive business timeframes.

This position requires an individual who possesses the personal attributes and the professional experience consistent with the Firm’s high standards of conduct and performance. In performing the job, the individual must demonstrate behaviors consistent with the company “Core Values” including:

Act with Integrity
  • Demonstrates values and ethics aligned to BGIM and is looked to for guidance on standards and norms
  • Follows through on commitments
  • Viewed as direct and truthful
  • Shows consistency between words and actions
  • Keeps confidences
Take Ownership
  • Feels a sense of personal responsibility in seeing tasks to completion
  • Works and thinks like a team member – owns their responsibilities while also considering the enterprise impact
  • Takes personal accountability for decisions, actions and outcomes
  • Takes an active role in managing their own career development
Be Curious. Challenge Conventional Thinking
  • Introduces new ways of looking at problems
  • Experiments and finds new solutions
  • Has the courage to push back and ask questions that advance the group’s thinking
Debate with an Open Mind
  • Uses compelling arguments in representing own interests while actively seeking to understand different perspectives
  • Steps up to address difficult issues, says what needs to be said
  • Champions an idea or position despite dissent or political risk
Build Strong Diverse Relationships
  • Relates openly and comfortably with diverse groups of people. This includes diversity in the broadest sense – diversity of thought and functional expertise in addition to race, gender, ethnicity, sexual orientation.
  • Builds effective formal and informal relationships inside and outside the organization – including co-workers, clients, vendors and parent company, Franklin Templeton.
  • Draws upon multiple relationships to exchange ideas, resources and know-how
  • Embraces the spirit of collegiality, mutual respect and teamwork
Strive For Balance
  • Prioritizes business needs with an understanding of personal realities
  • Understands and considers competing agendas and priorities within the firm when making decisions
Primary Responsibilities:
  • Develop and deploy scalable Enterprise Data Solutions (Enterprise Data Warehouses/Hubs/Lakes, Data Marts, ETL/ELT workloads, etc.) using MPP cloud data warehouses such as Snowflake and various RDBMSs such as Oracle, PostgreSQL, SQL Server, and MySQL.
  • Work closely with the Data Architect to design data models for OLAP/OLTP systems using modeling techniques such as Ralph Kimball dimensional modeling, star and snowflake schemas, Slowly Changing Dimensions (SCD), temporal models, 3NF, entity models, and the Data Vault methodology, for business domains such as CRM (Customer Relationship Management), investment portfolio management and accounting, trading, ledger management, cash database, investment performance and attribution, and quantitative trading.
  • Build the data lake design pattern of Raw, Curated, and Analytic layers on the Snowflake data warehouse and AWS S3 storage, and implement an ELT/ETL framework for data ingestion.
  • Support the execution/implementation of the Data Strategy, Enterprise Data Hub/Data Warehouse, Data Quality, MDM, Data Governance, Business Intelligence, and Advanced Analytics (AI/ML) initiatives.
  • Build a data solution layer using big data cloud architecture to empower business users to build analytical data apps/self-service tools using Business Intelligence tools, APIs, and Machine Learning (ML)/Artificial Intelligence (AI) technologies and frameworks.
  • Implement and manage the Brandywine Enterprise Data Hub/Lake and Data Architecture.
  • Design data migration processes to move data from various on-premises/cloud data sources such as Oracle, PostgreSQL, SQL Server, and CRM to the Snowflake cloud data warehouse.
  • Interact with all levels of users, including portfolio managers, traders, portfolio compliance, analysts, performance and operations team members, and other technologists, to gather requirements for various data needs and to build and execute the Brandywine Data Repository (Snowflake), proprietary applications, and Business Intelligence (BI) opportunities supporting various Brandywine business groups.
  • Conceptualize and develop new reports, dashboards, and KPI scorecards for various Brandywine business groups to provide data insights using Business Intelligence (BI) tools such as MicroStrategy and Power BI.
  • Deploy rich graphical visualizations in dashboards to create effective views of data, such as line charts, bar charts, pie charts, scatter plots, histograms, tree maps, heat maps, area charts, Gantt charts, and bullet graphs, in MicroStrategy and Power BI.
  • Through expert knowledge, translate high level business requirements into data models in alignment with organizational standards, strategic direction, and industry best practice.
  • In collaboration with the architects, application developers, and product owners, design the data structures, schemas, and environments (DEV/QA/PROD/DR) of the Enterprise Data Hub to provide the highest level of availability and performance.
  • Design and build the data loaders/pipelines that feed the Enterprise Data Hub, and monitor for optimal performance.
  • Participate in the selection of appropriate toolsets (including data integration, data management, and business intelligence) to support the growing needs of the Enterprise Data Hub.
  • Optimize the SQL, stored procedures, and views of the Enterprise Data Hub for increased performance as requested.
  • Execute business analysis, design, development, testing, and implementation of new data added to the Enterprise Data Hub from CRM API, Oracle, SQL Server, and PostgreSQL database technologies at an expert level.
  • Make recommendations to improve data architecture standards, guidelines, and processes to ensure data quality.
  • Contribute to the development, testing and implementation of strategic business solutions.
  • Provide root cause analysis and resolution for production defects.
  • Work independently as well as thrive in a team-oriented environment.
  • Be flexible and responsive to rapid change and manage priorities accordingly.
  • Stay abreast of new developments within the industry and implement solutions based on DevOps committee standards.
Job Specifications:
  • Strong proficiency with relational database design and SQL programming using Snowflake, MS SQL Server and Oracle. 
  • In-depth experience with cloud technology providers (AWS/Azure)
  • Technical expertise with industry standard data integration / ETL (Extract, Transform, and Load) tools
  • Demonstrated knowledge of data management best practices, concepts and tools, including data governance, data warehouse, data integration, and BI technologies.
  • Ability to work on multiple projects, manage user expectations and complete work requests within the stated or expected deadlines.
  • Technical expertise with scheduling software such as Autosys.
  • Proficient with business intelligence/reporting packages such as Power BI, MicroStrategy, etc.
  • Proficient with various enterprise application integration approaches, including SOAP and JSON web services and message-based application communication.
  • Thorough knowledge of the SDLC in both waterfall and agile development models
  • Working knowledge of key buy-side investment management processes such as portfolio management, trading, portfolio compliance, pricing, analytics, client service, and operations including STP.
  • Strong Analytical and problem-solving skills.
  • Excellent verbal and written communication skills.
Other Duties:
  • Provide occasional off-hours support as needed to maintain business service levels.
  • Participate in Disaster Recovery and Business Continuity planning and annual DR tests (Saturday and Sunday).
  • Support for corporate and regulatory audits.
  • Design and build Brandywine proprietary applications based on various Brandywine business group requirements using technologies such as Autosys, .NET, C#, and Perl scripting.
  • Write Oracle PL/SQL, Autosys jobs, and .NET code to implement the business logic for investment portfolio management and accounting, trading, ledger management, cash database, and investment performance and attribution.
  • Customize the vendor investment accounting product Eagle STAR/PACE to calculate daily portfolio valuation, stock pricing, trading, and portfolio performance returns for various Brandywine investment products.
  • Other duties as assigned.
  • Bachelor's degree required. Degree in Computer Science, Engineering, or Mathematics strongly desired. Advanced degrees a plus.
  • Additional certifications in investment management or database technologies (e.g., CFA designation, SnowPro certification) a plus.
  • 5+ years of data application development experience.
  • Working knowledge of Eagle Pace data repository and portal a plus.
Brandywine Global Investment Management, LLC is an Equal Opportunity Employer