Requested Update Report for Dominica
Prepared September 2016
In October 1997, the National Committee on Foreign Medical Education Accreditation (NCFMEA) initially determined that the accreditation standards used by the Dominica Medical Board (DMB) to evaluate medical schools in Dominica were comparable to those used in the United States. The NCFMEA reaffirmed its prior determination that the standards and processes used by the country for the evaluation of its medical schools remained comparable in 2001 and 2007. The NCFMEA requested that the country submit periodic reports describing its continuing accreditation activities. Dominica's next petition for continued comparability was reviewed at the Fall 2013 NCFMEA meeting. Due to concerns raised at that meeting, the Committee deferred its decision concerning the country's comparability pending the review of additional information. At the Fall 2015 NCFMEA meeting, the Committee reviewed the additional information that was provided by the country in response to the Fall 2013 concerns and again deferred a decision regarding the country's comparability. At that time, the country was requested to provide additional information and documentation for review at the Fall 2016 NCFMEA meeting related to four remaining areas of concern. The country's responses to the Committee's Fall 2015 concerns are the subject of the current staff analysis.
Summary of Findings
The country has addressed the four areas of concern identified at the Fall 2015 NCFMEA meeting, and no additional information is requested.
An update on the Dominica Medical Board's 2015 focused site visit to the Ross University School of Medicine and the Board's policies regarding continued accreditation despite ongoing findings of noncompliance;
On November 12-14, 2015, the Dominica Medical Board (Board) conducted a focused site visit to Ross University School of Medicine (RUSM) at its Miramar, Florida location. The focused site visit was the recommended follow-up to the 2014 focused site visit. Exhibit 1 is the final Team Report of the Focused Site Visit conducted in November 2015. The Team Report reflects the rigor and thoroughness with which the Board assesses a medical school’s compliance with the Board’s accreditation standards. As described in the site visit report, the 2015 focused site visit team concentrated on RUSM’s progress with respect to certain standards where the Board had previously found “compliance with monitoring”. Those standards related to the following areas: the Board-imposed enrollment cap, mission and objectives, governance and administration, clinical programs, curriculum, medical students, and academic advising. As part of the 2015 focused site visit, the team also considered RUSM’s performance on certain outcomes measures that the Board had requested RUSM to supply in Spring and Fall 2015. The two data submissions allowed the Board to monitor and assess RUSM’s performance with respect to those measures and to use that assessment in its evaluation of RUSM’s compliance with the standards it is monitoring. In certain areas examined, the focused site visit team found that RUSM is in compliance with pertinent Board standards. In the remaining areas, it found that RUSM is in compliance with pertinent Board standards but recommended ongoing monitoring, including regular collection and review of relevant metrics, because RUSM’s implementation of relevant policies, processes, resources, and systems is still under way and continued review will help the Board assess whether those efforts are succeeding.
To carry out such ongoing monitoring, the site visitors recommended that, as part of the reaccreditation review scheduled to occur in 2017, the Board follow up on standards where the focused site visit team recommended compliance with monitoring and continue to require RUSM to produce the data that the Board requested after the 2013 focused site visit. For more information about the focused site visit team’s findings and recommendations, we refer you to the full text of Exhibit 1. On June 28, 2016, the Board approved the site visit team's report and adopted the report's findings and recommendations. In addition to the focused site visit, during the past year the Board has conducted several site visits to clinical locations. In June 2015, the Board conducted clinical site visits at two locations in the Chicago area (i.e., Mount Sinai Hospital, Saint Anthony Hospital). Exhibit 2 contains the related site visit reports. In August 2015, the Board conducted clinical site visits at several locations in the New York area (i.e., Brookdale University Hospital and Medical Center, St. John’s Episcopal Hospital, Jamaica Hospital Medical Center, New York Methodist Hospital, South Nassau Communities Hospital). Exhibit 3 contains the related site visit reports. In May and June 2016, the Board conducted clinical site visits at two locations in California (i.e., Kern Medical Center, California Hospital Medical Center), two locations in Connecticut (i.e., Danbury Hospital, Norwalk Hospital) and two locations in Maryland (i.e., Prince George’s Hospital Center, St. Agnes Hospital). Exhibit 4 contains the site visit reports related to the California, Connecticut, and Maryland clinical site visits. Exhibit 5 contains the site visitors’ summary findings and recommendations with regard to the clinical sites visited in 2015 and 2016.
Analyst Remarks to Narrative
In the Fall 2015 ED final staff analysis, the Dominica Medical Board (DMB) was requested to provide information related to a focused site visit to the Ross University School of Medicine (RUSM) that was scheduled to be conducted by November 2015. Information was requested not only about the 2015 focused site visit, but also about the Board's policies on granting accreditation despite continued areas of noncompliance with the DMB's standards (Final Staff Analysis, p. 2). This information was requested due to the number of ongoing focused on-site reviews being conducted at RUSM. In 2012, the DMB conducted a reaccreditation review at RUSM. As a result of that comprehensive review, the DMB identified a number of areas of non-compliance with the DMB standards, but granted the institution renewed accreditation for a period of five years. As a follow-up to the 2012 comprehensive review, documentation provided by the DMB indicates that it conducted subsequent focused site visits related to the areas of non-compliance at RUSM in 2013 and 2014 (Ex. 1, p. 11). The DMB conducted a third focused site visit related to the 2012 areas of noncompliance at RUSM's administrative offices in Miramar, Florida on November 12-14, 2015.

2015 Focused Site Visit

In response to the concerns raised in the 2015 ED Final Staff Analysis, the DMB provided a copy of the 2015 RUSM focused on-site review report (Ex. 1). According to the report, the 2012 reaccreditation review identified findings at RUSM related to compliance in nine areas of the DMB standards, and the 2015 focused site visit examined the institution's compliance related to six remaining areas of the standards (Ex. 1, p. 4). The 2015 site visit team consisted of two members of the DMB, a medical school vice dean, and a medical school professor. Two attorneys were also present for the visit.
The 2015 on-site review report stated that RUSM had made "significant improvements" in its compliance with the DMB's standards. All of the previous areas of noncompliance received a team recommendation of compliance, but with an added condition of ongoing monitoring. The on-site review report stated that the additional monitoring was necessary because many of the institution's initiatives related to its compliance had only recently been implemented and needed ongoing assessment in order to determine their effectiveness. The areas in need of ongoing monitoring will receive increased scrutiny during RUSM's next scheduled comprehensive reaccreditation review in 2017. The on-site review report also indicated that the team reviewed selected RUSM outcomes data. Since the 2013 focused site visit, RUSM has been submitting two reports per year related to student outcomes. The 2015 review report indicated a 97.3% first-time pass rate on the USMLE Step 1, with an average score of 219.4. It indicated an 81.6% pass rate on Step 2 CK, with an average score of 224.0. The Step 2 CS had an 87.6% pass rate. The residency attainment rate (inside or outside the match) was 87.8%. The six-year graduation rate for RUSM's September 2009 cohort was 72.8% (Ex. 1, pp. 4-5). In addition to the 2015 focused site visit to RUSM's administrative offices, the DMB also conducted site visits to a number of clinical sites during 2015-2016. The agency provided site visit reports and a summary report related to those visits (Exs. 2, 3, 4, and 5). The summary report indicates a need for additional monitoring of clinical sites in several areas related to compliance with the DMB standards, as well as DMB processes and procedures (Ex. 5, p. 2).

Board Policies

The country did not provide any information regarding its policies related to an institution's continued accreditation in the face of ongoing findings of noncompliance, as was requested in the 2015 ED staff analysis.
Staff accepts the information that the country has provided regarding the DMB's 2015 focused site visit to RUSM. However, additional information is again requested regarding the agency's policies related to continued accreditation in the face of ongoing findings of noncompliance with the DMB's standards.
In the course of evaluating a medical school’s compliance with the Dominica Medical Board (“DMB”) standards, if the DMB identified areas of noncompliance, the DMB would determine an appropriate accreditation action based on the facts of the situation and would expect the medical school to come into compliance in a timely manner (i.e., generally within 24 months). As described in the Standards and Procedures for Certification of Medical Education Programs, Exhibit A, “[a] school must promptly correct significant deficiencies in compliance with accreditation standards. Failure to do so will constitute grounds for adverse action unless the period for achieving compliance is extended for good cause. Unless an exception is made, the specified period may not exceed 24 months.” The draft staff analysis states that in 2012, as part of a reaccreditation review of Ross University School of Medicine (“RUSM”), the DMB identified a number of “areas of non-compliance” with DMB standards. We want to emphasize that there appears to be a misunderstanding about the nature of the DMB’s findings and conclusions in 2012 and subsequent focused site visits. The DMB has NOT made ongoing findings of noncompliance with its standards regarding RUSM. Rather, the DMB has made determinations that RUSM was “in compliance with a need for monitoring” for a number of its standards (see details below). A finding of “in compliance with need for monitoring” is NOT the same as a finding of noncompliance.
The DMB’s approach to evaluating RUSM’s compliance with DMB standards (i.e., for each standard, making a finding of “in compliance,” “in compliance with a need for monitoring,” or “noncompliance”) is consistent with the approach taken by the Liaison Committee for Medical Education (“LCME”), although the LCME has recently adopted slightly different terminology: “satisfactory,” “satisfactory with a need for monitoring,” “unsatisfactory.” In the 2012 reaccreditation review, the DMB identified several standards with respect to which, although RUSM was in compliance with the standards, such compliance should be monitored in an ongoing manner. The DMB refers to those standards identified for ongoing monitoring as the “2012 Standards to Monitor.” The DMB prescribed ongoing monitoring of the “2012 Standards to Monitor” because, at the time of the 2012 reaccreditation review, RUSM had developed new policies, processes, resources, or systems in many of the identified areas, and the DMB wanted to collect relevant data and make appropriate assessments as to whether the desired outcomes were being achieved with respect to those policies, processes, resources, and systems. When the DMB voted to grant RUSM a five-year reaccreditation in 2012, it impressed upon RUSM the need to make substantial improvements in relation to the “2012 Standards to Monitor,” and it decided to monitor RUSM’s progress by conducting a limited site visit in 2013 with respect to the “2012 Standards to Monitor.” Subsequent limited site visits were conducted in 2014 and 2015. This approach is consistent with that taken by the LCME, which may conduct one or more limited site visits to a medical school, focusing on specific issues, between its regular reaccreditation visits. During the limited site visit in 2014, the site visitors found RUSM to be “in compliance” with certain of the “2012 Standards to Monitor”. 
That is, the DMB determined that RUSM had shown that desired outcomes were being achieved and that monitoring was no longer necessary. In connection with the 2015 focused site visit, the site visitors identified additional “2012 Standards to Monitor” with respect to which RUSM had made significant improvements and where desired outcomes were being achieved. For those “2012 Standards to Monitor”, the team found “compliance” with no need for monitoring. From the original set of “2012 Standards to Monitor,” the 2015 focused site visitors found that a limited number of standards remained “in compliance with a need for monitoring” (i.e., IS-8, ED-2, ED-8, ED-30-A, ED-46, ED-47, ED-48, MS-10, MS-18, and MS-19). The DMB believes that ongoing monitoring is needed with respect to those standards because, although RUSM has appropriate policies, procedures, resources, or systems in place and is therefore in compliance with the standards, some initiatives have been recently implemented or modified in response to ongoing input from the DMB. While trends in data related to those policies, processes, resources, and systems are positive and aligned with desired outcomes, more time and data will allow better assessment of the impact of these changes. The DMB has nevertheless conducted ongoing monitoring visits to RUSM because the DMB believes strongly that accreditation and quality assurance are not only about compliance with standards; they are also about institutional self-improvement. Like the LCME, the DMB embraces institutional and programmatic self-improvement as a core component of accreditation and quality assurance. See LCME, “About,” http://lcme.org/about/ (“LCME accreditation is a voluntary, peer-reviewed process of quality assurance that determines whether the medical education program meets established standards. 
This process also fosters institutional and programmatic improvement.”) (last visited August 24, 2016). The DMB has always taken seriously its role with respect to both the assessment of RUSM’s compliance with DMB standards and the promotion of RUSM’s efforts to improve continuously its medical degree program. The DMB ultimately has the best interests of RUSM students in mind, and it believes that its ongoing monitoring of RUSM, even while RUSM has been in compliance with DMB standards, has benefited those students through quality improvements to the medical degree program and improved student outcomes.
Analyst Remarks to Response
In response to the draft staff analysis, the country emphasized that during the course of the RUSM's 2015 focused site visit, the institution was found to have come into compliance with the DMB's standards, and also emphasized that a finding of compliance with a need for additional monitoring differs from a finding of non-compliance. The country reiterated that the need for additional monitoring at the RUSM was due to the recent implementation of changes whose effectiveness must be observed over time. As requested, the country provided a copy of its certification standards and procedures document (Ex. A), which addresses certification decisions (p. 22) and re-evaluation and monitoring (pp. 22-23). The DMB standards specify that the DMB will review both documents furnished by the institution and the report of the on-site review teams in reaching a certification decision. Areas of non-compliance with the DMB's standards must be corrected within 24 months. Progress reports may be required to address specific areas of concern. Following submission of progress reports, the DMB may take appropriate action, such as accepting the report, receiving the report as information, deferring action and requesting additional information, or declining to accept the report and requesting another report. If strong concerns are raised, the DMB may request additional information, arrange for a focused on-site review, or arrange for a new comprehensive review of the institution. The DMB grants five-year periods of certification. If at the end of the five-year period the DMB determines that it needs additional information to make a certification decision, the DMB may extend the grant of certification for one additional year. During the one-year period, the DMB will collect additional information and make a certification decision before the year expires. Staff accepts the agency's narrative and supporting documentation, and no additional information is requested.
Changes to the database presentation, to include data on the percentage of students in the clinical site tracks;
In January 2016, the Board updated and otherwise revised its database document. The revised database, which will be utilized by the Board in connection with the reaccreditation review of RUSM scheduled to occur in 2017, is attached as Exhibit 6. Among many other things, the revised database document (i) solicits information about the process the medical school uses to assign medical students to one of several clerkship tracks and (ii) asks the medical school to describe the process by which a student may request an alternative assignment and identify on what basis the medical school will permit a student to be reassigned. See Exhibit 6, Section III. Medical Students, page 9 of 35. In addition to the database document, the Board continues to require RUSM to produce the data that the Board requested after the 2013 focused site visit. See Exhibit 7 for a copy of the 2016 data request; specifically, item #1 asks RUSM to identify the “percentage of total slots that are tracked rotations” and the “percentage of slots in each core clerkship that are tracked rotations”, and item #2 asks RUSM to identify the “number of clinical slots available in each core clerkship and number of tracked clinical slots available in each core clerkship.” The collection of such data enables the Board to compare RUSM’s performance on specific measures from year to year. See e.g., Exhibit 8, pp. 1-2, for the Board’s longitudinal analysis of RUSM’s responses to the data request in January 2014, October 2014, April 2015, and September 2015. As described in that summary document, the percentage of total slots that are tracked rotations increased from 49.6% in September 2013 to 98.2% in September 2015. The Board expects RUSM’s 2016 data submission to show that 100% of third-year medical students are now entering track rotations for their core clerkships. In accordance with the Board’s metric data request (see next question), RUSM will supply the relevant data to the Board in September 2016.
Analyst Remarks to Narrative
In the Fall 2015 final staff analysis, the country was requested to provide additional information as to how the DMB uses data it collects from RUSM, including information about how it analyzes the data to determine compliance with its standards for purposes of continued accreditation. In response to the concerns raised in the Fall 2015 final staff analysis, the country reports that the DMB revised its overall database document (Ex. 6). The revised document requires institutions to provide information on the medical school's background, institutional setting, educational program, medical students, faculty, educational resources, and required courses and clerkships. Included are questions regarding class size, adequacy of teaching resources, and the process for assigning students to campuses, locations, and clerkship tracks (Ex. 6, Section III, p. 9). In addition to the overall database document, as a result of the 2013 focused site visit the DMB requested that RUSM also provide specific supplemental data reports biannually (Ex. 7). As more data accumulated, those data requests changed from biannual to annual in 2016. In the supplemental reports, the institution was required to submit responses to 21 questions regarding percentages of tracked rotations; clinical clerkship slots; core clerkship completions; NBME Subject Examination scores; grade distributions; visits to clinical clerkship sites; responses to calls received from students; results of a student satisfaction survey; USMLE pass rates; residency attainment rates; and completion, transfer, and attrition rates. Using the supplemental data collected from RUSM in response to the focused reviews, the DMB has conducted a longitudinal analysis of the data that RUSM submitted from September 2013 to September 2015 (Ex. 8). That data was provided by RUSM in response to the 21 supplemental questions required by the document shown in Ex. 7.
In most areas, RUSM's numbers or percentages improved, in some cases dramatically, although in many instances they remained relatively low despite the improvement. For example, the number of tracked clinical slots increased markedly, although the percentage of students completing all core clinical clerkships within 60 weeks or prior to taking the USMLE Step 2 CK and CS remained rather low (Ex. 8, pp. 1-3). Staff accepts the country's narrative and supporting documentation regarding the data that the DMB is collecting on an ongoing basis from RUSM, and no additional information is requested. The country reports that RUSM's 2016 supplemental data will be available for discussion at the Fall 2016 NCFMEA meeting, although it was not available at the time that the country submitted its current report for staff analysis.
Third-year comparative metric data from Ross University School of Medicine for years 2013/2014/2015; and
Exhibit 8 summarizes data that RUSM provided to the Board in January 2014, October 2014, April 2015, and September 2015. In the past, the Board has requested two full data submissions each calendar year. Now that the Board has multiple years of complete data and in light of the upcoming reaccreditation review, which will require a substantial data submission, the 2015 focused site visitors recommended that the Board require RUSM to provide the data submission once in 2016. See Exhibit 7 for a copy of the 2016 data request. Following discussion between the Board and RUSM, the Board agreed to accept RUSM’s 2016 data submission in two parts based on when such data will become available. RUSM will supply responses to certain data requests (i.e., Exhibit 7 items #6 and 10-21) in July 2016 and will supply responses to the remaining data requests (i.e., Exhibit 7 items #1-5, 7-9) in September 2016 because the latter data will not be available until September 2016. The Board will report to the NCFMEA in an appropriate forum about the data received from RUSM (e.g., in response to the draft Staff Analysis or at the fall 2016 NCFMEA meeting) after it has been received.
Analyst Remarks to Narrative
In the Fall 2015 final staff analysis, the country was requested to provide additional information as to how the DMB uses data it collects from RUSM, including information about how it analyzes the data to determine compliance with its standards for purposes of continued accreditation. As noted in the previous section and described in the country's narrative, the DMB has revised its overall database document and has also been requiring RUSM to provide additional supplemental data in response to concerns identified as a result of the 2013 focused site visit to RUSM (Exs. 6 and 7). The DMB has compiled the information provided in response to the focused site visit (Ex. 7) into a longitudinal data assessment for 2013, 2014, and 2015 (Ex. 8). The country reports that it will have the 2016 numbers available for discussion by the time of the September 2016 NCFMEA meeting. Although the country has amply demonstrated that it is collecting and analyzing data from RUSM, the country has not provided any information as to how the data that the DMB is collecting, through either its overall database or the supplemental database that resulted from RUSM's 2013 focused on-site visit, is actually being used by the agency. It is not clear to ED staff how the information resulting from the data collection effort is being applied to determine compliance, or noncompliance, with the DMB standards, what thresholds/benchmarks, if any, may have been established, or whether there are consequences to RUSM for not meeting any established thresholds/benchmarks. Additional information is therefore still requested regarding how the DMB applies the data it collects from RUSM to determine compliance, or noncompliance, with its standards for purposes of continued accreditation.
The country is requested to provide information and supporting documentation that demonstrates how the information resulting from the data that the DMB collects and analyzes is applied to determine compliance with the DMB standards.
As an initial matter, we wish to clarify the nature of the comprehensive database document, which, as reported to the NCFMEA, the DMB recently revised, and the periodic data requests. The comprehensive database document, Exhibit 6 to the Original Narrative, is used for reaccreditation reviews. The revised database document is being used in connection with RUSM’s reaccreditation review, scheduled for 2017. The periodic data requests are more in the nature of annual reports and were initiated to allow the DMB to compare RUSM’s performance on specific measures from year to year. The data also has been used to assess RUSM’s progress with respect to the “2012 Standards to Monitor.” The ongoing collection and analysis of data from RUSM provides the DMB and its site visitors with comprehensive longitudinal information that allows the DMB to monitor RUSM’s progress with respect to various metrics, particularly with respect to standards that have been designated “in compliance with a need for monitoring” (the “2012 Standards to Monitor”). As a matter of course, the collection and evaluation of data supplements but does not replace the careful holistic assessment of RUSM’s compliance with various DMB standards by experienced and skilled site visitors. For example, in the several years since the 2012 reaccreditation review, the DMB and its site visitors have found it valuable to monitor in an ongoing fashion, sometimes as often as twice each year, RUSM’s progress in implementing new policies and approaches designed to improve RUSM’s performance in several areas, including a) the comparability of clinical clerkship sites, b) student achievement on NBME Subject Examinations and USMLE Step examinations, c) student residency attainment, and d) attrition rates.
The regular collection and analysis of this data submitted by RUSM has informed the site visitors’ recommendations to the DMB regarding RUSM’s performance with regard to the “2012 Standards to Monitor.” When conducting a focused evaluation of standards “in compliance with a need for monitoring,” such as the “2012 Standards to Monitor,” the DMB would be unlikely to find RUSM “in compliance” with regard to a standard for which relevant data indicated a negative, uncertain, or unpredictable trend over time. At this time, the DMB has not adopted rigid thresholds or benchmarks for judging data to determine compliance with its standards. However, when appropriate, the DMB and its site visitors may use a benchmark or goal to evaluate RUSM’s achievement. For example, in the DMB’s revised database document, Exhibit 6 to the Original Narrative, for each required clerkship, the DMB asks the medical school to explain any instance where the low end of the range of the percentage of students who logged each required patient encounter or required procedure is less than 80%, and to describe steps being taken to improve the student experience at relevant sites. Similarly, the site visitors reference the longitudinal data to make targeted requests for additional information, ask follow-up questions of RUSM faculty, officials, and students while conducting site visits, and make recommendations that lead ultimately to an accreditation action. In other words, although the DMB has not to date invoked rigid thresholds or benchmarks standing alone to make categorical determinations of compliance or noncompliance, the DMB reviews the data provided and uses the data, including longitudinal trends in such data, to assist the DMB in analyzing RUSM’s compliance with DMB standards. To date, the DMB has received some data from RUSM in response to the 2016 data request. See Exhibit 7 to the Original Narrative. 
As reported in the Original Narrative, following discussion between the DMB and RUSM, the Board agreed to accept RUSM’s 2016 data submission in two parts based on when such data would become available. As requested, RUSM supplied responses to certain data requests in July 2016; the DMB expects to receive responses to the remaining data requests in September 2016. Exhibit B provides information about selected data supplied to date. The DMB will report to the NCFMEA about additional data received from RUSM after it has been received.
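The DMB's 80% review trigger described above (requiring the school to explain any clerkship where the low end of the range of students logging each required patient encounter falls below 80%) can be sketched as a simple screening check. The sketch below is purely illustrative: the clerkship names and percentage ranges are hypothetical, not actual RUSM database entries.

```python
# Illustrative sketch of the 80% review trigger described in the narrative.
# All clerkship names and percentages are hypothetical examples.

THRESHOLD = 0.80  # low end of the logged-encounter range that triggers an explanation

# Hypothetical per-clerkship (low, high) range of the percentage of students
# who logged each required patient encounter or procedure across sites.
logged_ranges = {
    "Internal Medicine": (0.92, 0.99),
    "Surgery": (0.76, 0.95),
    "Pediatrics": (0.84, 0.97),
}

# Clerkships whose low end falls below the threshold would require the school
# to explain the shortfall and describe improvement steps at relevant sites.
needs_explanation = [name for name, (low, _high) in logged_ranges.items()
                     if low < THRESHOLD]
print(needs_explanation)  # ['Surgery']
```

The point of such a benchmark, as the narrative notes, is not a categorical compliance ruling but a prompt for targeted follow-up questions and requested improvement steps.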
Analyst Remarks to Response
In response to the draft staff analysis, the country clarified that its comprehensive database document is used to collect information for its comprehensive re-accreditation reviews, whereas its periodic data requests are annual reports. The data collection efforts provide longitudinal information and help to establish trends that are used in monitoring the RUSM. Although the DMB has not established benchmarks, the information collected may form the basis for requests for targeted information for review by the DMB and on-site review teams in analyzing the RUSM's ongoing compliance with the DMB's standards. As documentation, the country provided a blank copy of its database document (Ex. 6), which requires that information be reported covering several years. Section II: Educational Program requires the institution to report a number of quantitative indicators for a three-year period, including USMLE results. The country also provided selected numerical data for the RUSM (Ex. B), which indicates that, for the three-year period covering 2013-2015, USMLE pass rates were increasing and attrition rates were dropping. Staff accepts the country's narrative and supporting documentation, and no additional information is requested.
An analysis of how the residency match rate is determined.
Historically, RUSM has calculated “match rate” using all students who graduated in a given year who obtained a residency in any specialty, whether through the National Resident Matching Program (NRMP or Match) or outside the Match. Using that methodology, RUSM’s residency first-time “match” rate for graduates has increased, from 86.7% in the period July 1, 2013 to June 30, 2014, to 87.8% in the period July 1, 2014 to June 30, 2015. The National Resident Matching Program calculates for certain non-U.S. countries a “U.S. international medical graduate (U.S. IMG) match status” that includes for a given year all U.S.-citizen students who attended medical school in the relevant country and who entered the Match, regardless of when they graduated or whether they previously entered the Match. In that calculation, the NRMP also does not include those who obtain a position through the Supplemental Offer and Acceptance Program (SOAP), and it counts a “successful match” only if the student matches in his/her first choice of specialty. This methodology produces a U.S. IMG match status rate for Dominica of about 54.8%. See Exhibit 9 (NRMP, Charting Outcomes in the Match: International Medical Graduates (Jan. 2014) at p. iii (description of methodology) and 22 (results)). The 2015 focused site visit team notes that success in the Match for U.S. students is dramatically lower for those who failed to match in the first attempt. In the site visitors’ experience, all U.S. medical schools report their match rate based on those who obtained a position, whether in the first or another specialty choice, including those who obtain a position through the SOAP, and count those who obtain a PGY-1 or preliminary year only. RUSM’s method is therefore similar to the method used by U.S. medical schools, while the NRMP method is not and results in a lower rate.
To avoid confusion, RUSM is now publicly reporting the calculated rate as a “Residency Attainment Rate” rather than “Match Rate” and in reports to the Board is referring to the rate as a “Residency Attainment Rate”.
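The methodological difference described above (different denominators and different definitions of a successful placement) can be illustrated with a small, purely hypothetical calculation. All cohort figures below are invented for illustration; they are not RUSM or NRMP data.

```python
# Illustrative comparison of the two rate definitions described above.
# All figures are hypothetical; they are not RUSM or NRMP data.

def attainment_rate(graduates, placed_in_match, placed_outside_match):
    """RUSM-style 'residency attainment rate': denominator is all graduates
    in a given year; numerator counts any residency position obtained,
    whether through the Match, the SOAP, or outside the Match."""
    return (placed_in_match + placed_outside_match) / graduates

def nrmp_style_rate(entrants, matched_first_choice):
    """NRMP-style 'U.S. IMG match status': denominator is all U.S.-citizen
    entrants to the Match regardless of graduation year (including prior-year
    re-entrants); numerator counts only first-choice-specialty matches and
    excludes SOAP placements."""
    return matched_first_choice / entrants

# Hypothetical cohort: 900 graduates, 700 placed via the Match (of whom
# 520 matched in their first-choice specialty), 90 placed outside the Match,
# and 950 Match entrants once prior-year re-entrants are included.
print(f"{attainment_rate(900, 700, 90):.1%}")  # counts all placements
print(f"{nrmp_style_rate(950, 520):.1%}")      # first-choice matches only
```

The same cohort thus yields very different headline figures depending on the denominator and on which placements count, which is the distinction the site visitors draw between the method used by U.S. medical schools and the NRMP's U.S. IMG calculation.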
Analyst Remarks to Narrative
At the Fall 2015 meeting, the Committee requested that the country provide an analysis of how RUSM's residency match rate is determined. In response to the request for additional information from the Fall 2015 meeting, the country states that RUSM has historically calculated the match rate using all students who graduated in a given year who obtained a residency in any specialty, whether inside or outside the match. Using this methodology, RUSM's first-time residency match rate for graduates has increased from 80.9% in 2012-2013 to 87.8% in 2014-2015 (Ex. 8, p. 14). The country notes that the NRMP calculates its match rates based upon whether students obtain matches in their first choice of specialty. Using this more restrictive methodology, the U.S. IMG match rate for Dominica is approximately 54.8%, according to information published by the NRMP for 2013 (Ex. 9, p. 22). Staff accepts the country's narrative and supporting documentation, and no additional information is requested.