University of Rochester Director for Educational Effectiveness

  • May 10, 2016 3:34 PM
    Message # 4012407

    Functional Title:       Director for Educational Effectiveness

    Department:            Deans’ Office – The College

    Supervisor:             Associate Dean of the College

     

    Position Description:

    The Director for Educational Effectiveness, reporting to the Associate Dean of the College and with direction from the Assistant Dean and Executive Director of the Center for Excellence in Teaching and Learning, will play a critical role in the development and implementation of an assessment program in departments and programs across the College. The director will coordinate and oversee assessment activities in the College and will work closely with the College’s Assessment Committee on a variety of activities and programs across all academic (undergraduate and graduate level) and non-academic departments. With support from the AS&E Office of Institutional Research, and working closely with the Center for Excellence in Teaching and Learning and other College partners, the director will also facilitate educational initiatives that emerge from assessment work.

     

    General Responsibilities:

    Understands and interprets assessment practices and procedures, and guides faculty and academic and non-academic administrators on them. Using independent judgment and experience, acts on behalf of the Dean in providing advice and identifying best practices for the development and implementation of assessment plans in departments and programs. Reviews accreditation standards and prepares, maintains and disseminates information related to accreditation for the College. Leads the College’s Assessment Committee and participates in other committees related to student success. Provides leadership, innovation and support for enhancing teaching and learning through the identification, evaluation and integration of effective educational innovations.

     

    Specific Responsibilities:

    Assessment Plans (25%)

    • Works collaboratively with faculty, administrators and staff to develop goals and assessment protocols for the College as a whole and for each department in the College.  Drawing on theory and best practices, the director provides guidance to deans and academic departments regarding assessment plans and appropriate evaluation processes.
    • Provides leadership, guidance and advice to deans, faculty and administrators involved in the design, development and evaluation of instructional programs and majors.
    • Working in collaboration with departments, establishes appropriate student learning outcomes and indicators for each major.
    • Oversees strategic planning process for assessment in the College.
    • Designs and conducts special studies as needed in departments, using qualitative and quantitative research methods.
    • Serves as primary administrator in the assessment process between academic departments and the Deans’ Office.
    • Works in collaboration with the Assistant Provost for Academic Administration. 

     

    Accreditation (20%)

    • Monitors changes in accreditation standards and ensures compliance with Middle States accreditation standards.
    • Ensures the College’s academic departments are prepared for accreditation review.  Interprets accreditation standards for departments and works with departments to address accountability expectations.
    • Establishes a repository for assessment plans (including student learning objectives and indicators).  Monitors data collection and documents uses of assessment in programs.
    • Leads the College in all matters related to accreditation. Serves as the College’s liaison to the University’s Provost Office for Middle States accreditation reviews.
    • Creates a web-based presence to assist in the assessment of academic programs for accreditation.

     

    Education Innovation (20%)

    • Collaborates with faculty and staff to provide leadership, innovation and support for enhancing teaching and learning through the identification, evaluation and integration of effective educational innovations that emerge from and/or incorporate assessment mechanisms.
    • Develops and delivers workshops on educational innovations for faculty and graduate students.
    • Engages the academic community in the exploration, discussion and assessment of educational innovations.

     

    Committee Staffing (10%)

    • Chairs the College’s committee on assessment.  Plays an active role in sharing with the committee strategies for developing and implementing assessments.
    • Serves as an active member of the University Committee for Educational Effectiveness.
    • Serves as an active member of the student retention committee. 
    • Participates in other relevant policy, project and planning committees as needed. 

     

    Current Research in Assessment and Educational Innovation (10%)

    • Actively engages in the field through scholarly reading and attendance at conferences and educational workshops. 
    • Researches and identifies new trends and techniques applicable for the College.

     

    Program Evaluation (5%)

    • Collaborates with faculty to develop evaluation plans related to grant proposals, including educational benchmarking and outcomes.
    • Serves as a resource to assist in identifying program objectives in measurable terms, key indicators of success, and plans and methodology for data collection and analysis, and in developing formative and summative measures and timelines to monitor the successes and failures of programs.

     

    Reporting and Analysis (10%)

    • Works with senior administrative staff to assist in the preparation of institutional benchmarking reports, analyzing data at both the institutional and department levels.
    • Prepares reports on the College’s progress on short- and long-term assessment goals.
    • Serves as a central repository for department and program reviews. Assists faculty members in transforming learning objectives into measurable outcomes.
    • Provides viability reports on new programs and new program initiatives. Establishes metrics for new program reviews.
    • In collaboration with AS&E stakeholders, manages surveys such as the COFHE Senior Survey, the Student Climate Survey, etc.

     

    Minimum Requirements: 

    Master’s degree in social sciences, educational research or a related field required; Ph.D. preferred. Five or more years of direct experience in higher education assessment preferred. Significant experience with regional accreditation processes preferred. Knowledge of institutional research is helpful. The ability to navigate multiple priorities and excellent organizational and communication (verbal and written) skills are necessary.

     

    This document describes typical duties and is not meant to limit management from assigning other duties as required.

  • May 31, 2017 1:01 PM
    Reply # 4863178 on 4012407
    Anonymous

    I am a collaborative engineering professional with substantial experience designing and executing solutions for complex business problems involving large-scale data warehousing, real-time analytics based on Natural Language Processing (NLP), and reporting solutions built on intuitive architectures that effectively analyze and process petabytes of structured and unstructured data. I have an extensive background in architecting and developing large-scale data processing systems and in serving as a subject matter expert in data warehousing solutions, working with a variety of database, analytical, and statistical technologies. My experience architecting highly scalable, distributed systems using open-source and vendor-specific tools, and designing and optimizing large, multi-petabyte clustered data warehouses, was primarily in clinical/medical research, academia, computer-integrated manufacturing, military battlefield self-awareness systems, aerospace, and financial/investment systems. Based on this experience, I was able to integrate state-of-the-art Big Data technologies from IBM and Oracle into the overall architecture and lead a multidisciplinary team of Senior Data Scientists, Software Engineers, and Enterprise Architects through the design, development, engineering testing, and implementation phases.

    I have worked in industry and academia with Analytics and Artificial Intelligence, designing and constructing neural networks to be used as computational models in machine learning based on NLP and linguistics. These models served as the basis for statistical applications developed in R and SAS, and for algorithms based on discrete and continuous mathematics with interoperability with rule-based application languages. I have used NLP as a means to examine and analyze patterns in data, statistics, and related content as those patterns change over time, and, at the next level, to provide human intelligence with a secondary, technology-generated self-awareness that supports the primary self-awareness of human intelligence. In summary, this means that as data and related content change through interactions with applications, human interactions, and streams of research, these NLP applications become aware of the new patterns through interpretative learning. Based on this awareness, information is derived that allows actions and recommendations to be readily added to these patterns and extended across other patterns by predictive and inferential models. The net result is that what we learn on a day-to-day basis by interacting with data, statistics, and content increases exponentially and organically, and its relevance yields meaningful information that works for us, for example, identifying the missing chemical component or surgical technique for DNA molecular cell structure breakdown in order to address a specific cause of autism, or increasing business competitiveness in a specific marketplace niche. The application of NLP was the missing link to the work completed in Analytics.



    I have many years of experience and applied knowledge as a Senior Enterprise Technical Systems Architect, Technical Project Manager, Software Engineer and Data Scientist, addressing the SDLC of an enterprise and/or system from business process analysis through systems/software architecture, development, quality engineering testing, quality assurance and training, for example with Natural Language Processing (NLP)/Analytics, Financial Systems, ERP, etc. My most recent full-time position was with Oracle Managed Cloud Services (OMCS) at the Oracle Corporation, where I served in multiple concurrent roles, for example Senior Lead Integration Architect for Advisory Integration Services, Data Scientist, Product Development, and Advanced Cloud Technologies. My specific responsibilities included advising OMCS Technical Operations and Service Delivery Managers on Cloud-enabled Adapters and supporting technologies; consulting with OMCS Customers and/or 3rd Party Implementers on requirements and the specific applied knowledge needed to integrate their environments with Fusion Applications; and serving as the conduit to Product Development Teams for each adapter technology and for Fusion applications and technologies, to understand the current status of releases and the supporting infrastructure and platform technologies. In addition, I was responsible for the architecture, deployment, and management of the OMCS R&D Lab for Engineered Systems, which included Exalogic, Exadata, SuperCluster (SSC), and Big Data Appliance (BDA). This management included platform enhancements, application and technology development for testing, deployment, and a training environment for OMCS Support Engineers.

    In parallel to my position with Oracle, I held several part-time academic positions, for example Adjunct Professor at the Syracuse University iSchool, Cornell University, and Saint John Fisher College, and Resident Senior Fellow at the Rochester Institute of Technology’s Saunders College of Business, and I served as Senior Advisor to the University Alliance for SAP Studies at the University of Wisconsin. I currently serve as a professor for Penn State University’s Enterprise Architecture Graduate and Data Science Programs, and as a Senior Fellow and Distinguished Lecturer for Systems/Software Architecture Development at Carnegie Mellon University. In addition, I gained an enormous amount of professional experience in the computer industry by working, in parallel to my full-time positions during the 1980s and 1990s, with major computer corporations such as Control Data, HP, General Data, GE/Honeywell, Prime, Burroughs/UNISYS, Apple, Digital/Compaq, Xerox, Danka, EDS, CSC, and Amdahl, as a means to enrich my Information Technology skills through applied knowledge.


