The Voluntary Sector in Canada

A number of university continuing education divisions have been approached in recent years by non-profit, voluntary organizations with requests that they provide educational programs to teach management skills or professional practice. Target audiences for these programs might be line staff, middle and senior managers, or voluntary board members. Usually those making the request say they are eager for the program but admit they do not really have sufficient money in the budget to pay for it, since the voluntary sector is in the middle of a funding and evaluation “squeeze”. Donations and government grants are both more difficult to obtain; concurrently, greater public and governmental demand for accountability is increasing the sector’s need for management and professional training.
The University of Manitoba and the Voluntary Sector

The University of Manitoba’s Continuing Education Division was among those approached in recent years and asked to provide educational and training services to human service, sports/recreational and arts/multicultural organizations. The university undertook this challenge through Project Manage, a four-year demonstration program co-funded by the Continuing Education Division and The Winnipeg Foundation.
Project Manage began in January, 1981, after nearly two years of discussion among a variety of community organizations which were concerned about the level of managerial skills exhibited by voluntary boards and senior staff of community organizations. The organizations which participated in these initial discussions and later sat on an advisory committee to Project Manage were: The Winnipeg Foundation, Social Planning Council of Winnipeg, Secretary of State, The United Way, The Volunteer Centre of Winnipeg, Manitoba Arts Council, Manitoba Sports Federation, the Junior League of Winnipeg, and The University of Manitoba.
I was its founding Executive Director.
* This article was developed and updated by Professor Wolf from a paper presented at the 1984 CAUCE Conference, London, Ontario.

This advisory committee continued to be quite active throughout the life of the project, meeting on a regular basis and participating in committee work concerned with program development, marketing, and program/administrative evaluation.
During the four-year demonstration period, about 5,500 people participated in Project Manage programs: approximately 1,200 in the first program year, about 1,800 in each of program years two and three, and the balance in the final year.
This paper is intended to describe the University of Manitoba’s experience with Project Manage, to assess its outcomes, and to provide recommendations and cautionary road signs for others who might choose to travel this path.
Description of Project Activities and Implementation
During its first four months, Project Manage offered no programs, but concentrated on assessing needs among human service, sports/recreational and arts/multicultural groups. This assessment was combined with the development of a database on Manitoba non-profits because no demographic or organizational performance profiles were available from other sources (a North America-wide problem, not restricted to Manitoba). A comprehensive needs survey was distributed to more than 1,000 people in more than 300 provincial non-profit organizations. Eventually, through mailbacks, telephone and personal interviews, a database was compiled for 220 organizations. It included both demographic information and a variety of indicators of organizational effectiveness.1
The survey indicated that upwards of 50 per cent of the sample required significant upgrading in skills related to policy administration, finance, public and community relations, and program evaluation. Board training and development, planning, and evaluation skills were identified as particularly weak areas. This information formed the basis for the work of the program development committee for at least the first year of the project. The results of the needs survey validated the initial mission statement for Project Manage: “To improve, through administrative training and supports to board members and senior staff, the organizational effectiveness of voluntary, non-profit organizations.”
It was recognized from the beginning of the project that non-profits did not have the financial resources to pay the full cost of training. Very few non-profits budget as much as one per cent for staff training and development, and almost none budget any amount for board training and development. Consequently, the program offerings were subsidized from the beginning, with the full knowledge that they would always require some subsidy. It was hoped that non-profits could be encouraged during the demonstration period to increase their budgetary allocation for training, and that funders could be encouraged to be more generous in this area (further comment on this will be made later in this paper). The approximate funding breakdown for the first three years was:
The Winnipeg Foundation: $120,000
Continuing Education Division: $105,000
Program and Consultation Fees: $135,000
To keep program fees as low as possible, the only full-time staff assigned to the project were one program director and one clerical support person. All other staffing was handled on a short-term, contract basis.
One of the first decisions of the advisory committee was that the offices for Project Manage should be located off campus. The University of Manitoba’s main campus is located in Fort Garry, in the far southern end of Winnipeg. Nearly all of the non-profits operate in the city’s central core area. Consequently, offices and instructional space were leased at 425 Elgin Avenue, in the city’s core.
This initial location decision was a significant factor in the Project’s subsequent success. Participants frequently commented that they would not have attended if they had had to fight traffic, parking problems, and a confusing campus layout to go to the University. The downtown location also seemed to ameliorate, for many participants, the psychological barrier of entering the university academic community. This was reinforced by the informal style of Project Manage workshops, and the extreme flexibility about hours for courses (any mutually convenient time). We were also extremely flexible, as the project progressed, about instructional locations, offering programs not only throughout the city of Winnipeg, but also in Selkirk, Brandon, Thompson, Flin Flon, Steinbach, Portage La Prairie, Waywayseecapo and Crane River, among others. We also conducted occasional programs in Northwestern Ontario.
Almost all of the promotion for Project Manage, during its four years, was done through mailed brochures. An introductory brochure was distributed to a variety of community mailing lists (Secretary of State, Social Planning Council of Winnipeg, United Way of Winnipeg, Manitoba Sports Federation, Manitoba Arts Council). This brochure included a return form which we used to build our own mailing list; the list eventually grew to more than 2,000 personal and organizational names. Considerable attention was given to the care of this mailing list, which came to be viewed as a community resource in itself and was used by other groups.
To keep overhead costs low, the Marketing Committee decided to use one primary brochure mailing a year, with a secondary insta-print-style brochure mailing in January as a program reminder. Therefore, the second- and third-year calendars were designed in wall-calendar format to encourage retention by clients. A subsequent marketing survey in year two indicated that almost 90 per cent of our clients still retained and referred to the calendar brochure seven months after distribution. By year three, the Marketing Committee also developed a stock brochure describing training programs for boards, which could be mailed out upon request and distributed in larger volume at meetings and conferences.
During the first year, program fees for seminars and workshops were arbitrarily set at about $70 a day, with the intention of gradually increasing the fees in years two and three as organizations were, we hoped, concurrently educated to budget more for training and development. The fall calendar of year two reflects this policy of gradual increases, and indicates fees up to approximately $90 a day. However, at about the same time, the provincial government, in mid-budget year, imposed a one per cent payroll tax levy on all employers, including voluntary non-profits. Most organizations decided to pay the payroll levy out of their training and development budgets, rather than out of their program budgets. Consequently, in January of year two, we dropped fees to about $35 a day in response to numerous client-group requests. This did result in an upswing in seminar registration.
The first year of programming experience taught us several other important lessons. First, we discovered that University business faculty members were unsuitable instructors for this audience. They tended to be too academically and corporately oriented, and they were accustomed to teaching in lecture or lecture-cum-seminar style. They also were not amenable to retraining to deal with this audience, because the stipends we were able to offer were not competitive. Therefore, we had to develop and train our own instructional resources. For staff development programs, this meant recruiting independent management consultants and trainers and orienting them to the needs and content required for this market. For board programs, this meant recruiting a small number of paid trainers/consultants who could do intensive work with boards, and also developing and training a corps of volunteer trainers who would do half-day or evening sessions with board groups.
As might be expected, this meant that we were not only involved in instructional training but, very rapidly, in curriculum development. Very few curriculum materials, case studies, and exercises are developed for the non-profit market in Canada (though a number of groups are now beginning to produce these, they are of widely varying quality). We used the most broadly-developed packages of materials we could find (the United Way Volunteer Leadership Development Program), but experience quickly taught us that these packages also were too general and theoretical to meet the needs of most of our target groups. Therefore, with the assistance of small grants from the Employment Development Branch of Canada Employment and Immigration, and also from the Secretary of State, we developed a number of workshop/seminar training modules for boards: The Executive Director Search, Improving Boardsmanship Skills, Meeting Management, Legal and Fiscal Issues Affecting Volunteers, Financial Management for Smaller All-Voluntary Organizations, etc.
Also, after the first year’s experience, the advisory committee began to reflect upon its mission statement. The mandate for Project Manage had been to improve organizational effectiveness. Like training programs of any kind, most university continuing education programs are designed to improve individual effectiveness and knowledge. We began to examine whether the most appropriate methodology for improving organizational effectiveness was to offer general seminars/workshops which 15 people from 15 different organizations might attend. We came to the conclusion that, except for those attending workshops/seminars in accounting or data processing, very little improvement in organizational effectiveness would result from following this traditional educational model.
In year two, therefore, we began to put more emphasis on marketing in-house workshops for boards and senior staff. The concept was welcomed by our target groups, who quickly saw both the training and cost-effectiveness advantages of this approach, and by the end of year two approximately 40 per cent of Project Manage’s offerings were in-house workshops.
In year three, this percentage rose to 90 per cent or more, and we further refined the training model. It could best be described as “consul-training”. Briefly, it worked like this: an organization would approach us with a perceived need for training or management information, and a trainer/consultant would meet with representatives of the group to discuss the background to the situation and arrange for whatever needs assessment might be required. Frequently, this would involve such investigatory procedures as a questionnaire for all board and senior staff members. After studying the material gathered in the needs assessment, the trainer/consultant would develop a contract with the organization for one or more workshops with board and senior staff members; this might be combined with long-range planning and goal-setting. In the process of these exercises, another management concern or problem (or the need for some unknown data to be researched) might come up, and we would provide professional expertise to assist the organization in dealing with it.
The final step in the “consul-training” process was evaluation of the results. This was done both at the end of the process and, also, where possible, five or six months later. As described under Program Outcomes below, this program model proved very effective in encouraging organizational change. It was also an extremely effective means of keeping us in close touch with operational problems at the organizational level.
A graphic example of “consul-training” in action was Project Manage’s contract with the provincial government’s Child and Family Services Directorate. In the fall of 1983, the province took over the Children’s Aid Society of Winnipeg, as the first move in a reorganization of the entire delivery system for services to children and families in the metropolitan region. As a result, six new community child and family service agencies were developed in the city of Winnipeg. Each had a governing board of directors and a neighborhood resource centre, though some continued to be regionally shared.
Philosophically, the takeover represented a major change. Manitoba became the first province to pass legislation funding preventive services for children and families, as well as protection services for children. The provincial Directorate had a permanent staff of only seven, but it disbursed more than $45 million a year in services to the affected region. To assist in the reorganization, the Social Planning Council of Winnipeg and Project Manage were asked to provide a number of special services. The services which Project Manage provided to the Directorate included:
• Acting as principal consultant to the Directorate in the development of a computerized critical-path implementation plan. This was the first time that the province had ever used a computerized CPM for any project other than hydro-electric development or telephone system construction. The program eventually included more than 500 planning activities which were co-ordinated through the computerized model;
• Advising on the organizational structure of the new agencies, the method and structure for election of board members, and the drafting of model bylaws for the new agencies;
• Identification of key policy issues in the reorganization and identification of resources for policy development;
• Training of new board members of the six agencies in general board skills. (The new boards included significant representation from native and ethnic groups, and curriculum development included consideration of their special language and literacy requirements);
• Designing and facilitating the program for orienting new board members to revised systems, policies and procedures;
• Producing a training and orientation manual for new board members;
• Training executive- and director-selection committees of the new boards in interviewing skills and hiring techniques and assisting these groups to develop first-year performance-appraisal plans;
• Working with the new boards on a continuing basis to outline training and development plans for board members.
Project/Program Outcomes
To this point, this paper has been concerned with identifying and discussing project outputs, i.e., the range of goods and services that Project Manage produced during its demonstration period. These included general registration events, in-house workshops and consulting services, production of curriculum materials, etc. “Outputs” is a convenient way to classify goods and services supplied by an educational institution and received by (or directed to) consumer groups, and it is reasonably easy to be objective about them. People may dispute which ones are important, but agreement on what a particular output is, how it is measured, and what the quantities are (within a reasonable margin of error) should be easily reached.
We move onto shakier ground in the study of organizations and education when we attempt to discuss “outcomes”. The concept of outcomes includes a subjective element of judgment because it involves human preferences and questions such as: should outputs have been distributed in other ways or in different proportions? Are the consequences good or bad for different consumers? Should consumers (in this case, non-profit organizations) who were worse off have been made better off as a result of these outputs? When “should” or “ought” appear, we move beyond facts into the realm of values.
It is stating the obvious to say that values are subject to change. In the case of Project Manage, the value expressed in the initial mission statement (to improve organizational effectiveness) itself underwent change during the implementation period. Where possible in the discussion that follows, an attempt will be made to clarify judgment criteria, but this will be a very imprecise exercise. (For discussion purposes, we will use Thomas Dye’s practical definition of an outcome as “something at the end of a complicated process rather than at the beginning, an effect rather than a cause.”2) A further caveat should preface this discussion of project/program outcomes. Ideally, it would have been nice to have a time stream of outcomes which could have been collapsed mechanically into a single measure, much like an economist’s procedure for discounting costs and benefits. This was not possible with the resources available to the project (if indeed it is possible at all). We therefore attempted a variety of “snapshot” approaches to assess outcomes, and have attempted to create a total panorama from these. We are very much aware of the limitations of this approach, but felt it was the one which best accounted for resource constraints and also for the large number of extraneous variables over which the project had no control. (These will be discussed below.)
It is always easiest to describe the demographics of participation, so we begin there. To review:
• more than 5,500 people participated in the four-year demonstration, and programs and interventions were delivered at a variety of sites;
• approximately 64 per cent of participants were staff or board members of human service agencies;
• approximately 15 per cent were associated with sports or recreational associations;
• approximately 11 per cent were associated with day care centres;
• arts and multi-cultural organizations provided about five per cent each; and
• between 85 and 90 per cent of participants were residents of Winnipeg but many had province-wide responsibilities.
About 67 per cent of registrants were associated with organizations which had annual budgets of from $50,000 to $1 million, with 45 per cent of these in the $50,000-$150,000 category. In each of three major demographic surveys conducted by Project Manage (see below), there was a strong correlation between organization size and participation in Project Manage. For example, nine per cent of Project Manage’s clientele had operational budgets in excess of $1 million, compared with four per cent of the general agency population. Seventeen per cent of Project Manage’s human-service participants were in the $1-million-plus category, compared with seven per cent of human service organizations in the general population. Most organizations which used our services were also older than the average for the agency population and had some organizational experience, i.e., 42 per cent were six to 10 years old, 21 per cent were less than five years old, and 21 per cent were more than 25 years old.
This participation profile is particularly interesting in light of the market-penetration statistics for Project Manage. In a survey conducted on non-users in the summer of 1983, 92 per cent of non-users were found to be aware of the existence of Project Manage and could identify most of its services, and 80 per cent were also aware that Project Manage was a service of the Continuing Education Division of the University of Manitoba. While our earlier surveys had indicated that users and non-users did not differ in their management- and board-training needs, 82 per cent of non-users reported that they did not make use of Project Manage services because of financial constraints and insufficient time to take courses. Only eight per cent reported that Project Manage courses did not fulfill any of their needs.
Price, then, was considered to be a major barrier to access (not an infrequent story in continuing education), even though our fees were comparatively low. This was particularly true during the last 30 months of the demonstration period, when the imposition of the payroll levy and accompanying stringent cutbacks in provincial funding of human services slashed training and development budgets to the bone. During program years two, three and four, we noted a strong increase in the number of registrants who were paying fees out of their own pockets. This trend was confirmed in our Summer 1983 survey, and shows the high level of desire and motivation among participants. They were seeking training and education despite financial constraints. That motivation can be considered even stronger when one considers the pay levels of staff in this sector and the volunteer status of board members. (One of Project Manage’s surveys during the demonstration period resulted in publication of a Wage and Benefit Survey of the non-profit sector, the first done in Canada. The compensation level in this sector was found to be even lower than popularly believed.)3
To move beyond demographic description to a substantive assessment of Project Manage, three major surveys were conducted, and these were supplemented by a variety of smaller evaluations. As noted earlier, the first four months of the project were devoted to developing a database on Manitoba non-profits. In addition to providing information about needs, these data gave us pre-test information on a group of organizations that subsequently self-divided into users and non-users. (About 25 per cent of the organizations included in the initial baseline survey subsequently used Project Manage services.)
The second major data collection resulted in the Wage and Benefit Survey, conducted in the summer of 1982. The resulting publication was widely sold and distributed, and this collection also allowed us to make a year-two update of our baseline data. In the summer of 1983, we surveyed a random stratified sample of users to attempt to assess medium-range outcomes, and also surveyed non-users from the initial control group established through the 1981 baseline survey. This survey was supplemented by intensive case-study review of 12 user organizations. Finally, the Wage and Benefit Survey was updated in 1985.
For individual general registration and in-house courses, we also used the standard “happiness ratings” which most continuing education courses solicit immediately following the instruction period. Some 95 per cent of course registrants indicated that they would recommend the courses they had taken to others. We also attempted some individual pre- and post-testing on specific courses, where grant budgets permitted. For example, we conducted a major pre- and post-test on a conflict management seminar for executive directors of day care centres. Directors and staff members were surveyed for behavioural characteristics prior to the course and four months later, and some statistically significant improvements in conflict-management behaviours were reported.
The wide range of data that we had collected allowed us to attempt to measure program outcomes at several different levels and also to look, in a modest way, at such interesting questions as “why do people decide to take the courses they do?” Generally, we found that when an organization has numerous management needs, including those of a long-term nature, it opts for training to meet immediate requirements (e.g., accounting, word processing). If the organization is not at high management risk, it may feel freer to concentrate on long-term and board training and development. This crisis-oriented approach to course selection or skill development is partially explained by non-profit compensation programs, i.e., these organizations are often not able to attract highly-trained people because of their low salary and benefits levels. Basic administrative skills are lacking, and these essentials must be developed first to avoid daily chaos. The long-term problems can wait, from the organization’s point of view, even though the organization has identified major long-range training issues in the areas of planning and evaluation.
These hypotheses were supported in the case study review of client organizations, in which we attempted to determine the level of organizational turbulence at the time training was sought and to correlate that turbulence or “chaos” with course selections. We developed a “chaos scale” of six weighted indicators of organizational turbulence: executive director turnover, staff turnover, board turnover, operating budget deficit, program changes, and cutbacks. A behavioural scale was developed within each of these indicators, and the selected organizations were assessed against it. We found that intensive, long-term team-building, organizational-development and planning-skills training were sought when the organization was operating at either an extremely high or an extremely low level of chaos. Moderate levels of turbulence (in which the rankings were designed to reflect average organizational condition) consistently resulted in training choices which were short-run, directed to the development of individual skills, and likely to have a lesser organizational impact.
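To make the weighting mechanics concrete, the short sketch below (in Python) shows how a composite “chaos score” of this kind could be computed. The six indicator names are taken from the list above; the weights, the 0-4 behavioural ratings, and the band thresholds are purely hypothetical illustrations, since the article does not report the actual scale values used.

# Hypothetical sketch of a weighted "chaos score". The six indicators come from
# the article; the weights, 0-4 behavioural ratings and band cut-offs are invented.

# Assumed weights for the six turbulence indicators (sum to 1.0 for this sketch).
WEIGHTS = {
    "executive_director_turnover": 0.25,
    "staff_turnover": 0.20,
    "board_turnover": 0.15,
    "operating_budget_deficit": 0.20,
    "program_changes": 0.10,
    "cutbacks": 0.10,
}

def chaos_score(ratings: dict) -> float:
    """Combine 0-4 behavioural ratings into a single weighted turbulence score (0-4)."""
    return sum(WEIGHTS[name] * ratings[name] for name in WEIGHTS)

def turbulence_band(score: float) -> str:
    """Classify the score into the low / moderate / high bands discussed above."""
    if score < 1.0:
        return "low"
    if score < 3.0:
        return "moderate"
    return "high"

# Example: an agency with heavy executive and staff turnover but a balanced budget.
example = {
    "executive_director_turnover": 4,
    "staff_turnover": 3,
    "board_turnover": 2,
    "operating_budget_deficit": 0,
    "program_changes": 2,
    "cutbacks": 1,
}
score = chaos_score(example)
print(f"chaos score = {score:.2f} ({turbulence_band(score)})")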
Assessing Impact

The critical question at the end of the demonstration period, of course, was: did Project Manage have an impact on its user organizations? What difference did it make that this training/consultation resource was available to voluntary non-profits? Such a difference is always extremely difficult to measure, especially over a short time. Ergo, this paper is about immediate outcomes, not distant impacts.
At the most superficial level, our marketing and training evaluation surveys indicated that, by our presence, we had helped organizations believe that non-profits could become more effective through training, development and consulting support. We made it easier for organizations that wanted to become more effective to do so, because specific guidance and training were easily accessible, at lower costs than in the open market.
Also, in the survey research conducted in the summer of 1983, we attempted to assess, among randomly sampled users, outcomes a considerable time after training had been delivered. Of those users surveyed, 86 per cent who had taken individual skill training directed at staff members reported specific impacts on performance. These impacts were varied and included such results as a more effective secretary, or improved competence among professional or administrative staff. Of those organizations surveyed which had selected in-house training for board and staff targeted to long-term issues, 64 per cent felt that the training had had an impact on their performance. Another 20 per cent felt that their boards had subsequently chosen not to change behaviours which had been pinpointed as problem areas through the training and consultation period. Team-building and role clarification were most frequently cited as auxiliary benefits.
In the summer of 1983 we returned to both the control group of non-users and the stratified random sample of users to attempt to assess organizational changes in performance in 14 key managerial areas (e.g., financial management, staff-management relationships, board leadership, committee operations, service delivery, organizational planning capacity, administration of personnel practices, etc.). We surveyed non-users to chart changes in performance indicators in these managerial areas during the previous three years; then we did the same for users. In 12 key managerial areas (membership/audience development and committee operations were the only exceptions), more users than non-users reported improved performance characteristics at statistically significant levels during the three-year period. We then returned to the user data and matched performance indicator changes against the content of the training provided. We were delighted to find that 90 per cent of the organizations which had received training in a specific area demonstrated improved managerial performance in that area.
In summary, we used a variety of objective and subjective methodologies to attempt to assess outcomes. In terms of the initial goals and objectives of the demonstration, we were successful in:
• demonstrating that there was a need for management training in the non-profit sector;
• demonstrating a variety of training models and assessing their varying outcomes upon this sector;
• demonstrating that training can achieve some positive organizational change, even when it is a minor intervention.
Beyond these tentative conclusions we could not go. We ran into the traditional problem of evaluating service outcomes, particularly educational outcomes, when the product is intangible, inseparable from its source, variable and highly perishable.
Lessons for Followers On the Way …
The Project Manage demonstration program was an unusual venture for a university continuing education division. We do not know of any similar program in Canada, although some other divisions are offering periodic seminar courses directed to the non-profit sector, and this is also being done by some community colleges and other educational institutions in collaboration with community agencies. However, an integrated service-delivery and training model did not, and does not, exist elsewhere.
Project Manage proved extremely educational for the Division, in particular for the management education department. A number of the marketing, program evaluation, and modular curriculum development lessons we learned have now been more widely implemented in the full range of management-certificate and seminar programs. The public relations impact for the university was at least as profound. More than 60 per cent of the 5,500 program participants were voluntary board members. All of them have other employment and community connections which make them potential consumers of university continuing education in other categories. The “goodwill” factor should not be underestimated by any university continuing education division which decides to offer programs to the voluntary sector.
The Division made the decision to continue Project Manage for an additional year, beyond its three-year demonstration; however, the model was modified as a result of resource constraints which, we hoped, would be of a temporary nature.
For other faculties interested in developing similar programs for the voluntary sector, a number of potential minefields should be noted:

1. There is the question of money. It cannot be said too often that participants from the voluntary sector can seldom pay their own way. The Project Manage model described cost between $70,000 and $75,000 a year in basic overhead, before program expenses were incurred. We were able to charge fees that covered variable program costs, although the argument could be made that even at an average of $45 a day we were cutting out many needy clients. Start-up expenses and basic overhead must be covered from other sources if the funding situation for non-profits is the same in other communities as it was, and is, in Winnipeg and in Manitoba. Foundations and government departments are all prospective sources but, in these perilous times, this approach to funding only works if you can put together a consortium of groups, all of whom will give their little piece if everyone else gives theirs! It is most difficult to convince governments that this is an appropriate activity because they like to fund direct services rather than indirect services to the community. They are often willing to help finance curriculum development, but they are never as willing to pay the costs of delivering the curriculum once it is set. It can also be difficult to put together a consortium because of non-profit politics.
2. There is the matter of politics. Any continuing education department which operates under the naive notion that the voluntary sector is a collection of well-intentioned, pure-in-heart doers of good will rapidly become disabused of this perception. This sector fairly seethes with Machiavellian machinations, particularly in times of heavy resource constraint. Questions of turf, funding and protocol all arise with great frequency and ferocity. Any university which becomes involved with the sector should cover all its political bases in the construction of its advisory committee and operating procedures and, even then, should work on developing a thick hide to insulate it against institutional backbiting. The students, both board and staff, who will participate in your programs are wonderful. They are keen, enthusiastic and supportive and will be your best promoters. But watch out for those groups who feel that training should be their preserve, particularly if they disburse any funds (however small) in the community.
3. You will have to contend with the effects of “history”. There will be many variables in designing and implementing a voluntary sector program which you will not be able to control. An example, in the case of Project Manage, was the introduction of the provincial payroll tax. It had a profound impact on our marketing and delivery design, yet it was a government action which, on the surface, appeared to be totally unconnected to training and development activity. Similarly, a decision by a major funder to change granting priorities, or the introduction by an outside agency of a competitive training program for voluntary boards, could have outcomes which are potentially negative for your program. The list of potential “history” effects is almost endless; no matter how well-designed or well-intentioned your program is, it will be a “minor intervention”, affected by all sorts of unexpected shifts in the environment.
Is the picture too bleak to contemplate? No; working in the voluntary sector is extremely rewarding if you are not interested in making money or in living a quiet life. The rewards lie in audience reception and in the scope for creative and experimental program design and delivery. If there are minefields, there is also an appropriate strategy for finding your way through them! The successful program for the voluntary sector will be characterized by at least four qualities:
1. It will have political resources. The programmer and advisory group will have to have community political clout, and it will have to be well understood in your community that voluntary sector training and development is your territory and that no significant changes in the resources directed to that area can be made without your tacit institutional consent or, alternatively, without your exacting a price for the change. To exercise that kind of clout you will have to have analytical resources, a good “ear” for the community, so that you will know when difficulties are brewing and can take the appropriate research and preparatory steps to counter them. Your program objectives may be pure and good, but you will also need good skills in bargaining and maneuvering, and in the pulling and hauling of the policy-implementation process, which can best be described as “a system of pressures and counterpressures.” When necessary, you will need to be able to muster the community troops required to protect your territory.
2. Your program staff will need keenly honed administrative skills. There will never be enough money to run this program properly, so the programmer in charge must be able to take several people’s interests and several people’s small bits of money and put them together to achieve program goals. As a corollary to funding skills, you must also have quality control and good administrative systems. Be ready to consider how you can use volunteers, extract service-in-kind resources from private firms, and work in formally contracted situations with other non-profit providers of funds or services.
3. Choose program staff who are “great protectors”.4 This is no arena for the meek. Although your political base for the program will have to be built into your advisory committee, your major staff programmer should be a strong public defender of voluntary sector training and development and of the university’s role in this function. A programmer should be willing to rally to institutional defence whenever that is appropriate, and should know how to rally community political supports to make the program “go”.
4. Program staff should be skilled at “fixing” the various “implementation games” which may arise as the project progresses. Bardach details the range of “implementation games” which can afflict any project or program, including the following: diversion of resources and budgets, deflection of goals, dilemmas of administration, dissipation of energies, etc. Bardach’s detailed discussion of all that can go wrong in implementing a program, and of how to foil “implementation games” (more systemic than malevolent), should be required reading for any programmer working in this area.
Is it a good thing for a university continuing education division to offer an integrated programming service to the voluntary sector? Can such a program have significant results? Can it survive? The answers depend both on your frame of reference and on your risk profile. There are problems and hassles; there are also, in at least equal measure, significant rewards. Painful as it can sometimes be to operate in this sector, it would be difficult to find a more challenging or a more appreciative audience.
A Brief Epitaph
Project Manage closed its doors at the end of June, 1985, a victim of further budgetary cutbacks. For the record, it went with a bang, not a whimper. We held a community “wake”, complete with black-edged invitations, for all community members, key volunteers, instructors and faithful clients. A coffin of publications was tastefully illuminated by black candles and decorated with funereal black balloons, and guests were asked to don black armbands as they came in the door.
Eulogies were delivered to the accompaniment of good food, appropriate Irish fiddle music and suitable libations. The closing thus left a good “aftertaste” in the community, kept opportunities for continued involvement in the sector open, attracted a lot of favourable media attention and even inspired a grilling of the Minister of Education, who was asked, in the Manitoba Legislature, to explain why “such a worthwhile project was subject to these cuts.”
Subsequently, the same community groups who had co-operated to form Project Manage mustered resources to mount a mini-project designed to retain a reasonably priced management consultation program for non-profits, along with some board-training resources. The successor program opened its doors at the Volunteer Centre of Winnipeg in May, 1986, with the University of Manitoba’s advice and support. The university itself has also implemented a three-year certificate program in the management of non-profit organizations which will enrol its first students in September 1986. This new program is directed to executive directors and other senior managers and, like Project Manage, has a broadly based community advisory group.
Nevertheless, there is no question that some groups have been excluded from management training by PM’s closing. Some audiences are no longer served at a price they can afford, and some can no longer find the specific training they need. However, major community efforts have succeeded in providing a “smorgasbord” of more limited management-training supports for the sector. The impact is not expected to be as great, but there is an element of cynical optimism at work. In any case, the community infrastructure is not sufficiently strong in these perilous funding times to provide the resources for a “one-stop shopping” model of non-profit training support.
That is the realistic assessment which provides the “cynical” part of the statement. The “optimistic” half believes that, even with the projected model (both smaller and more dispersed), Winnipeg non-profits are better served in this program area than are the non-profits of any other major Canadian city.
An historical community value of co-operation was strongly reinforced (former Winnipeggers will know whereof I speak) during the planning and implementation of PM, and this has helped to bring the new models to fruition.
Was Project Manage A Success?
I believe it was. For one thing, it did generally good work while it survived and provided valuable information because we made a real effort to track and document what worked and what didn’t. This information was shared, and provided a planning base for future activity. Perhaps the greatest tribute to Project Manage came when its more modest successor was introduced to the community. Its name? Manage II of course!
FOOTNOTES
1. Tim Phail and Jacke Wolf, “Using Network Analysis in Human Service Planning: Fitting the Pieces Together,” Business Quarterly, Vol. 50, No. 3 (London: University of Western Ontario School of Business).
2. Thomas R. Dye, Politics, Economics and the Public: Policy Outcomes and the American States (Chicago: Rand McNally, 1966), pp. 1-3.
3. These trends were confirmed in a second Wage and Benefits Survey published in June 1985.
4. Eugene Bardach, The Implementation Game (Cambridge: The MIT Press, 1977), pp. 26-27.
5. Ibid., Ch.2-3.
BIBLIOGRAPHY
1. Bardach, Eugene. The Implementation Game: What Happens After a Bill Becomes Law. Cambridge, Mass.: The MIT Press, 1977.
2. Dye, Thomas R. Politics, Economics, and The Public: Policy Outcomes in the American States. Chicago: Rand McNally, 1966.
3. Gross, Neal; Giacquinta, Joseph B.; and Bernstein, Marilyn. Implementing Organizational Innovations: A Sociological Analysis of Planned Educational Change. New York: Basic Books, 1971.
4. Levy, Frank; Meltsner, Arnold J.; and Wildavsky, Aaron. Urban Outcomes: Schools, Streets and Libraries. Berkeley: University of California Press, 1974.
5. Pressman, Jeffrey L., and Wildavsky, Aaron. Implementation: How Great Expectations in Washington are Dashed in Oakland; or, Why It’s Amazing That Federal Programs Work At All. Berkeley: University of California Press, 1973.
6. Weiss, Carol. Evaluation Research: Methods of Assessing Program Effectiveness. Englewood Cliffs, N.J.: Prentice-Hall, 1972.
7. Wildavsky, Aaron. “Evaluation as an Organizational Problem.” University Working Papers 13. London: Centre for Environmental Studies, 1972.
8. Wildavsky, Aaron. “The Self-Evaluating Organization.” Public Administration Review 32 (Sept./Oct. 1972): 509-520.
JACKE WOLF
Assistant Professor and Head, Management Studies, Continuing Education Division, University of Manitoba