Christine Greenhow, Julia Sonnevend, and Colin Agur (eds)
- Published in print: 2016
- Published Online: January 2017
- ISBN: 9780262034470
- eISBN: 9780262334853
- Item type: book
- Publisher: The MIT Press
- DOI: 10.7551/mitpress/9780262034470.001.0001
- Subject: Society and Culture, Media Studies
The past ten years have brought significant growth in access to Web technology and in the educational possibilities of social media. These changes challenge previous conceptualizations of education and the classroom, and pose practical questions for learners, teachers, and administrators. Today, the unique capabilities of social media are influencing learning and teaching in ways previously unseen. Social media is transforming sectors outside education by changing patterns in personal, commercial, and cultural interaction. These changes offer a window into the future(s) of education, with new means of knowledge production and reception, and new roles for learners and teachers. Surveying the uses to which social media has been applied in these early years, we see a need to re-envision education for the coming decades. To date, no book has systematically and accessibly examined how the cultural and technological shift of social media is influencing educational practices. With this book, we aim to fill that gap. This book critically explores the future of education and online social media, convening leading scholars from the fields of education, law, communications, and cultural studies. We believe that this interdisciplinary edited volume will appeal to a broad audience of scholars, practitioners, and policy makers who seek to understand the opportunities for learning and education that exist at the intersection of social media and education. The book will examine educational institutions, access and participation, new literacies and competencies, cultural reproduction, international accreditation, intellectual property, privacy and protection, new business models, and technical architectures for digital education.
Steven Brint and Jerome Karabel
- Published in print: 1989
- Published Online: November 2020
- ISBN: 9780195048155
- eISBN: 9780197560044
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195048155.003.0006
- Subject: Education, Organization and Management of Education
Of all the changes in American higher education in the twentieth century, none has had a greater impact than the rise of the two-year junior college. Yet this institution, which we now take for granted, was once a radical organizational innovation. Stepping into an educational landscape already populated by hundreds of four-year colleges, the junior college was able to establish itself as a new type of institution—a nonbachelor’s degree-granting college that typically offered both college preparatory and terminal vocational programs. The junior college moved rapidly from a position of marginality to one of prominence; in the twenty years between 1919 and 1939, enrollment at junior colleges rose from 8,102 students to 149,854 (U.S. Office of Education 1944, p. 6). Thus, on the eve of World War II, an institution whose very survival had been in question just three decades earlier had become a key component of America’s system of higher education. The institutionalization and growth of what was a novel organizational form could not have taken place without the support and encouragement of powerful sponsors. Prominent among them were some of the nation’s greatest universities—among them, Chicago, Stanford, Michigan, and Berkeley—which, far from opposing the rise of the junior college as a potential competitor for students and resources, enthusiastically supported its growth. Because this support had a profound effect on the subsequent development of the junior college, we shall examine its philosophical and institutional foundations. In the late nineteenth century, an elite reform movement swept through the leading American universities.
Beginning with Henry Tappan at the University of Michigan in the early 1850s and extending after the 1870s to Nicholas Murray Butler at Columbia, David Starr Jordan at Stanford, and William Rainey Harper at Chicago, one leading university president after another began to view the first two years of college as an unnecessary part of university-level instruction.
Richard M. Freeland
- Published in print: 1992
- Published Online: November 2020
- ISBN: 9780195054644
- eISBN: 9780197560082
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195054644.003.0005
- Subject: Education, History of Education
This book began as an exploration of a paradox in the history of American universities. In the twenty-five years following World War II, the student population served by these institutions became more diverse and the societal purposes they served became more varied. Yet, during the same period, universities themselves became more alike. The contradictions were easy to observe. It was obvious that the academic and social backgrounds of students—and consequently their needs, skills, and interests—became more heterogeneous in the postwar years, yet the undergraduate curricula of universities increasingly stressed highly academic subjects, especially the arts and sciences. Similarly, universities pursued a well-documented trend toward greater involvement in practical affairs and social problem solving in the 1950s and 1960s, while also adhering to a narrowing focus on doctoral programs and research in the basic disciplines. I wanted to understand the forces, both internal and external to campuses, that promoted this puzzling conjunction of converging characteristics and expanding functions. I also wanted to assess the academic and social consequences of this pattern. The decline of institutional diversity was only the most startling of a number of apparently inconsistent developments associated with an era of historic growth among universities. Almost as curious was the fact that, while expansion occurred mostly to accommodate increased demand for college education, institutional attention to teaching diminished, as did concern about the undergraduate curriculum. Meanwhile, graduate programs, whose chief function was to train college teachers, tended to slight preparation for instructional work and to nurture research skills. Indeed, as growth intensified academia’s role in socializing the nation’s youth, universities dismantled the programs of general education that were the primary vehicles they had created for that purpose. 
More broadly, the active involvement of universities in the definition and resolution of social problems went hand in hand with the consolidation of an academic value system quite remote from most Americans. Even the increasing heterogeneity of the student population was not free of contradiction. Academic leaders claimed credit for making their institutions more democratic during the postwar years by reducing traditional barriers to admission—including those of income and race.
Richard M. Freeland
- Published in print: 1992
- Published Online: November 2020
- ISBN: 9780195054644
- eISBN: 9780197560082
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195054644.003.0008
- Subject: Education, History of Education
In the years following World War II, academic leaders in Massachusetts participated in a national debate about the social role of higher education in the era that lay ahead. They also experienced the beginnings of a period of expansion for universities that would continue, more or less uninterrupted, for twenty-five years. Change in this postwar golden age involved an ongoing interaction between ideas and opportunities: the first concerning the public purposes of higher education; the second promising glory for institutions and advancement for academic interest groups. For most of the period, the dominant view—inside and outside of higher education—was that expansion was improving the academy as well as the country, but the turmoil of the late 1960s raised fundamental doubts about the character of postwar change. Although World War II entailed difficulties for universities, their extensive involvement in the military effort stirred a new awareness of the social importance of academic work. This habit of thought extended into the postwar period, as educators, exhilarated by wartime patriotism, looked for new ways to contribute to social problem solving. As they did so, they exhibited a further effect of their recent experience: a tendency to focus on national concerns—as distinct from regional or local ones—far more intensively than they had done before 1940. The country's agenda was long. The human costs of the war, and the even more frightening possibility of atomic conflict, made the importance of maintaining peace evident. Europe had precipitated two wars in a generation and now lay in ruins. The United States, suddenly the preeminent power of the globe, would have to pioneer in shaping a stable world order. In some, the nation's new international prominence aroused a sense of urgency about discrimination and inequality at home.
More broadly, world leadership implied a need to maintain military and economic power and the technological vitality on which they depended. Many educators believed they had important roles to play in all these contexts—through training leaders, forming attitudes, and advancing knowledge. As one college president put it: “Events... have shaken the complacency of many university communities and compelled educators to... make [their] maximum contribution to a decent, well-ordered, free and peaceful society.”
Richard M. Freeland
- Published in print: 1992
- Published Online: November 2020
- ISBN: 9780195054644
- eISBN: 9780197560082
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195054644.003.0011
- Subject: Education, History of Education
Tufts College, traditionally focused on undergraduate education in the arts and sciences, responded to the opportunities of the postwar years with new emphases on research and doctoral-level programs. A new name, “Tufts University,” signified the change. The leaders of Tufts intended, however, to retain a primary emphasis on undergraduate work. During these same years, a new university, Brandeis, sponsored by a group of American Jews, joined the state’s academic community. Brandeis’s founders also conceived their institution as centrally concerned with undergraduate education, although they too intended to build a modest array of graduate programs, especially in the arts and sciences. In projecting their development during the 1950s and 1960s, Tufts and Brandeis set out to become different versions of a distinctive institutional idea: the college-centered university. By the early 1940s, President Leonard Carmichael of Tufts, like his counterparts at Harvard and M.I.T., had come to regard World War II as a time of opportunity, despite immediate, war-related problems of enrollment and finance. Carmichael’s wartime reports referred repeatedly to new possibilities arising from the military emergency. He welcomed a Navy R.O.T.C. unit to Medford as a chance for greater visibility as well as for public service. He speculated that increased awareness of international issues would benefit the Fletcher School. Most important of all, given Tufts’s history of straitened finances, was the possibility of new federal support. “It is ... not too early,” Carmichael told his trustees in the middle of the war, “for all of us to do what we can to see to it that the men who administer our postwar education [at the federal level]...
have an appreciation of the importance to this nation of colleges and universities with varied objectives and varied bases of administration and support.” If federal funds were to become available, Carmichael wanted to be sure that private institutions got their share, and he assured his board that “every effort is being made to maintain our relationships with the armed services... so that Tufts’s peculiar qualities—a university-college in which teaching and research go forward together—may be maintained ...”
Steven Brint and Jerome Karabel
- Published in print: 1989
- Published Online: November 2020
- ISBN: 9780195048155
- eISBN: 9780197560044
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780195048155.003.0014
- Subject: Education, Organization and Management of Education
Since its origins at the turn of the century, the junior college has had a complex, and at times uneasy, relationship with a public that has looked to the educational system as a vehicle for the realization of the American dream. Despite its self-portrayal as “democracy’s college” and its often heroic efforts to extend education to the masses, the two-year institution has faced widespread public skepticism. For to most Americans, college was a pathway to the bachelor’s degree, and the junior college—unlike the four-year institution—could not award it. Moreover, the early public junior colleges were often tied administratively and even physically to local secondary schools, a pattern that compounded their problems in gaining legitimacy as bona fide institutions of higher education. The two-year institution’s claim to being a genuine college rested almost exclusively on its promise to offer the first two years of a four-year college education. Yet the junior college was never intended, despite the high aspirations of its students, to provide anything more than a terminal education for most of those who entered it; indeed, at no point in its history did even half of its students transfer to a four-year institution. Nonetheless, for at least the first two decades of its existence, almost exclusive emphasis was placed on its transfer rather than its terminal function. As the early leaders of the movement saw it, the first task at hand was to establish the legitimacy of this fragile institution as an authentic college. And this task could be accomplished only by convincing the existing four-year institutions to admit junior college graduates and to offer them credit for the courses that they had completed there. 
If the pursuit of academic respectability through emphasis on transfer dominated the junior college movement during its first decades, by the mid-1920s a countermovement stressing the role of the junior college as a provider of terminal vocational education began to gather momentum. Arguing that most junior college students were, whatever their aspirations, in fact terminal, proponents of this view saw the institution’s main task not as providing a platform for transfer for a minority but, rather, as offering vocational programs leading to marketable skills for the vast majority.
Anthony R. Oliver
- Published in print: 2019
- Published Online: November 2020
- ISBN: 9780198801740
- eISBN: 9780191917158
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198801740.003.0019
- Subject: Clinical Medicine and Allied Health, Professional Development in Medicine
According to the International Organization for Standardization (ISO), in ‘Medical laboratories—Requirements for quality and competence’ (BS EN ISO 15189:2012), accreditation is defined as ‘a procedure by which an authoritative body gives formal recognition that an organization is competent to carry out specific tasks’. Accreditation is delivered by the ‘competent authority’ based on a set of defined standards and the continual internal audit of the laboratory processes and infrastructure against these standards to achieve conformance. Additionally, the ‘competent authority’ periodically undertakes assessments to ensure compliance with the standards. These assessments vary in frequency and nature depending upon the assessment body. In some instances (e.g. the UK Accreditation Service, UKAS), the assessments are annual and based on a four-year cycle covering the whole laboratory repertoire and infrastructure. The HSE is responsible for the inspection and licensing of microbiological containment level 3 and 4 facilities. The HTA is responsible for the legal registration of laboratories that process and store human tissue, and is mainly histology related. The MHRA provides guidelines on good laboratory practice, good clinical practice, good clinical laboratory practice, and good manufacturing practice, largely around clinical trial work. It is also responsible for accreditation of blood transfusion laboratories. Finally, it provides guidance on the In Vitro Diagnostic Medical Device Directive (IVDMDD, 98/79/EC) and the regulation of medical ‘devices’, including diagnostic devices, where a ‘device’ is defined as including reagent kits and analytical platforms. EFI provides guidance and standards for transplantation and tissue-typing laboratories across Europe. Until 2009, CPA provided accreditation for the majority of UK pathology services. CPA was acquired by the UK Accreditation Service in 2009.
UKAS is a government-appointed national accreditation body for the UK that is responsible for certification, testing, inspection, and calibration services, and is the competent authority for all ISO standards, not just pathology. It covers various sectors, including healthcare, food production, energy supply, climate change, and personal safety. The majority of UK pathology services will be UKAS ISO 15189 accredited by 2018, including transitional ‘dual’ CPA standards/ISO 15189 accreditation between 2015 and 2018. UKAS also provides ISO 22870:2006 accreditation, which is specific to point-of-care testing, as well as ISO 17025:2005, which applies to calibration standards.
Teri Cannon
- Published in print:
- 2017
- Published Online:
- May 2018
- ISBN:
- 9780262037150
- eISBN:
- 9780262343695
- Item type:
- chapter
- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262037150.003.0026
- Subject:
- Education, Educational Policy and Politics
American accrediting agencies have been under increasing pressure from the government, employers, and other policy makers. These agencies are being asked to hold accredited educational institutions accountable for student learning outcomes, on-time retention and completion, and other key indicators of institutional and student success. At the same time, accreditors are often accused of stifling innovation in education with unnecessarily restrictive policies, bureaucratic and burdensome procedures, and a peer review process that is biased against new ideas and entrants into the sector. We faced these dynamics in seeking approval for Minerva to affiliate with the Keck Graduate Institute and to offer its programs in a delivery modality that had never been seen before. The process required us to build support for innovation while demonstrating the evidence-based foundation for our curriculum and teaching methods, and to balance the new with generally accepted and traditional indicators of quality.
Julee T. Flood and Terry L. Leap
- Published in print:
- 2018
- Published Online:
- May 2019
- ISBN:
- 9781501728952
- eISBN:
- 9781501728969
- Item type:
- chapter
- Publisher:
- Cornell University Press
- DOI:
- 10.7591/cornell/9781501728952.003.0001
- Subject:
- Education, Higher and Further Education
The wide range of U.S. institutions of higher learning faces a number of challenges, such as balancing faculty research, teaching, and service expectations; the high and sometimes prohibitive cost of a college education; eroding academic standards; heated debates over curriculum issues; administrative bloat; and the contentious nature of promotion and tenure decisions. Risk management is often the key to dealing with these issues.