SURVEYING JOB VACANCIES IN LOCAL LABOR MARKETS: A HOW-TO MANUAL


prepared by the University of Wisconsin-Milwaukee Employment and Training Institute, December 1998

SUMMARY

This manual on how to conduct job vacancy surveys was prepared under contract with the Employment and Training Administration of the United States Department of Labor (#X-6752-8-00-80-60) so that state and local policy makers can benefit from the experience of the Milwaukee project and previous job openings surveys conducted in the United States and other countries.

The University of Wisconsin-Milwaukee Employment and Training Institute has conducted job vacancy surveys twice annually since May 1993 for use in program planning, training and policy analysis. The surveys grew out of the interest of local governments and schools in job availability data at the local level. The surveys are funded by the local governmental partners (the City of Milwaukee, Milwaukee Area Technical College, Milwaukee Public Schools, Private Industry Council of Milwaukee County, and University of Wisconsin-Milwaukee), the Helen Bader Foundation, Milwaukee Foundation and U.S. Department of Housing and Urban Development. The University's Institute for Survey and Policy Research assists in drawing the sample, data entry and telephone follow-up. Operational uses of survey results by the governmental partners require detail on occupations, job location, skill requirements and identification of "difficult to fill" jobs within the four-county metropolitan area.

Job vacancy survey design, sampling, methodology, weighting, survey administration, data verification and data analyses issues are described in detail in this manual. Uses of job openings data to assess spatial and skills mismatches within subareas of the labor market and to target training and transportation strategies for workers are also described.

CONTENTS

What We Have Learned from Prior Job Vacancy Studies

  1. The Purposes of Vacancy Studies
  2. Background on Job Openings Surveys in the United States
  3. Vacancy Surveys in Other Countries
  4. Employment and Training Institute/ISPR Survey
  5. Job Vacancy Survey Instrument and Technical Approach
    1. Questionnaire Design
    2. Vacancy Definitions
    3. Printing, Mailing and Postage Issues
    4. Address Correction Problems
    5. Suggestions for Handling Returned Mail
    6. Telephone Contact Problems
    7. Response Data
    8. Surveying by Phone, Mail or Both
    9. Suggestions for Follow-Up Telephone Calling
    10. Using Internet Listings of Job Vacancies
    11. Data Entry and Coding
    12. Identifying Dead Companies
    13. Sample Selection
    14. Weighting


WHAT WE HAVE LEARNED FROM PRIOR JOB VACANCY STUDIES
  1. Most countries which have used job vacancies as an economic indicator do not survey establishments but instead analyze job openings listed with the public employment service. However, employer utilization of the public employment service varies widely. Even in countries where listings with the employment service are mandatory, coverage may not be complete. National vacancy surveys using a sample of establishments have been conducted in the United States, Australia, United Kingdom, The Netherlands, South Africa, Canada and New Zealand.

  2. Vacancy surveys should be kept simple and designed to reduce the reporting burden on the employer as much as possible. Many large firms already prepare weekly job listing sheets which provide detail on the number of full-time and part-time openings by occupation. Adding survey questions on job turnover, duration of vacancies, etc. (which require action by payroll or other departments in the establishment) may reduce response rates, particularly for large establishments and companies with high turnover of workers.

  3. Annual or occasional (rather than monthly or quarterly) administration of vacancy surveys may provide sufficient data for program and policy concerns at the local level. Less frequent surveying may help improve response rates.

  4. Use of existing vacancy surveys and definitions can avoid time-consuming survey design work and field testing. The Job Openings Pilot Program (JOPP) and the Employment and Training Institute (ETI) surveys used the vacancies portion and definitions of the Job Openings and Labor Turnover Survey (JOLTS), which are recommended here. Questions can be added or deleted, but it is recommended that the design remain simple and straightforward.

  5. While survey requests for openings data for an entire month or bi-weekly pay period may yield interesting information, they will require considerably more effort to complete than a request for a listing of job openings on a specific day. Vacancy survey data on "difficult-to-fill" positions, prerequisite skill training, rate of pay, and full- or part-time status of openings can help policy makers assess labor shortages by sector and identify occupations with high turnover (e.g., entry-level jobs in the retail and service sectors).

  6. The experience of the ETI, JOPP, ETJO (Employee Turnover Job Openings), and SOLD (Australian Survey of Labour Demand) surveys suggests that a combination of mail and telephone contacts may be most effective. Smaller companies are more likely to respond via telephone as they seldom have openings and may feel it is not important to return the survey unless they have openings to report. Smaller companies may view the mail survey as a nuisance, but may not mind a short telephone survey. On the other hand, larger companies respond at higher rates via mail and are more difficult to contact by phone. Stand-alone mail surveys are not recommended due to low response rates and the likelihood of non-response bias.

  7. In a stratified sample, cells which have low sample sizes and/or response rates may result in high weights and error levels, suggesting the need to draw a larger sample and/or solicit higher response rates through mail and phone follow-up. Sample selection methodology and response rates will affect the weights used and related error rates, particularly for smaller establishments where openings are less likely to occur.

  8. A comparison of the ES-202 file (a common source for Department of Labor establishment surveys) and GENESYS (a commercially available yellow pages listings file) was conducted to determine the coverage of each file as a source for the survey sample population. The yellow pages telephone-based listing was found to include considerably more listings, particularly for smaller companies. The yellow pages file is also much more likely to have a correct address and telephone number, particularly for multi-site and franchise establishments. A yellow pages database (InfoUSA, formerly ABI) is currently used by the Employment and Training Administration and state employment services as a source of employer information for job seekers. The cost of the yellow pages database may be offset by the time and effort required to clean up the address file and locate phone numbers omitted in the ES-202 file. For those states which do not provide the ES-202 file to researchers, the yellow pages file is recommended.

  9. A letter from the parties cooperating in a local survey detailing the purpose of the survey may enhance response rates via mail and at the same time provide the promise of confidentiality. Correspondence to employers should avoid reference to the public employment service since some establishments may not want their positions listed with the public employment service and consequently may not respond if there is a suspicion that this may occur.

  10. Involvement of local governments and educational institutions in the development of job openings survey projects can improve the level of employer cooperation with the survey and the subsequent uses of the data for public policy, establishing priorities for education and training programs, and counseling job seekers. In Milwaukee the survey request is accompanied by a letter from the president of the technical college, chancellor of the university, mayor of Milwaukee, executive director of the private industry council, and president of the major local foundation supporting the research.


Comments should be addressed to John Pawasarat, Employment and Training Institute, University of Wisconsin-Milwaukee, 161 W. Wisconsin Avenue, Suite 6000, Milwaukee, WI 53203. PHONE (414) 227-3380, FAX (414) 227-3233, EMAIL pawasara@uwm.edu, WEBSITE www4.uwm.edu/eti
I. THE PURPOSES OF VACANCY SURVEYS

Vacancy statistics have been collected by dozens of industrialized nations since at least the 1950s, when skilled labor shortages and low unemployment rates prompted use of vacancy data to provide an economic indicator, address the labor exchange function, provide analyses of labor market conditions, and identify and forecast occupational shortages. Since the 1940s and 1950s there has been interest in using vacancy data to examine structural as well as frictional unemployment and, by comparing the number of unemployed to the number of openings, to gauge full employment and economic well-being. At the local level the public employment service sought data on job openings as a tool to more effectively target training efforts for unemployed and under-employed workers as demand for skilled positions increased and unskilled employment decreased. During the 1980s scholars stressed the importance of collecting and analyzing vacancy data to compare the number of unemployed and underemployed workers to openings (Riemer, 1988; Abraham, 1983; Levitan and Gallo, 1989; Holzer 1989). More recently, employer vacancy surveys have been used to assess geographic and skills mismatches (Holzer 1996). Demand information from establishment vacancy surveys, combined with supply data from household surveys, can provide policy makers, employers and educators with analyses better suited to meeting the employment needs of those seeking work or expected to work, in the context of where job openings are located and the skill levels required. The purpose of the survey, of course, drives the design and geographic coverage.

Vacancy data are commonly used as economic indicators at the national level. The three sources of data typically used to measure vacancies are public employment service job listings, want ad listings and establishment surveys. Use of job listings filed with local and state employment service offices is common but problematic, particularly in countries where such listings are not mandatory. Many companies do not list with the public employment service, particularly in urban areas where alternative placement sources are more readily available. Employment service listings also tend to remain on file and may not be purged until well after a position has been filled, with no point-in-time measure possible beyond the opening or closing date of the listing. The number of internal hires and professional positions not recruited through the employment service may also skew measured occupational demand.

Counts of want ad listings are often used at the local, state and national level to provide an inexpensive economic indicator and to identify vacancy trends, job shortages and occupational supply and demand. Want ads have some obvious limitations: the job location and number of openings are often not specified, some companies do not recruit through want ads, the positions shown may not be available for immediate hire, and some vacancies may not be listed. Despite these limitations, want ads are commonly used as a barometer of employment demand. The Conference Board uses the want ads index as an economic indicator at the national level and it has been found to be a useful indicator of vacancies (Abraham, 1987). In Australia a monthly Skilled Vacancy Survey index is based on a count of vacancies for skilled occupations listed in want ads of major metropolitan newspapers.

Establishment surveys of vacancies used to assess job gaps and skills shortages are typically conducted in local labor markets where public policies and training interventions can consider supply and demand mismatches. These local "applied" uses often require analyses at an occupational level by industrial sector. Other applied uses cited for locally administered vacancy surveys include:

Interest in a vacancy survey in the Milwaukee area was initially driven by the City of Milwaukee Fair Housing and Employment Commission's concern about the number of jobs compared to job seekers in central city neighborhoods of Milwaukee. (This interest paralleled the focus of 1940s full employment measures in the United Kingdom, Canada and the United States.) Previous local attempts to estimate openings relied upon a count of employment service job listings and used a multiplier to estimate jobs not listed. The Milwaukee Area Technical College expressed a willingness to participate in a pilot project, with an interest in using the results to improve its instructional offerings, identify emerging occupations and target high demand occupations. The Private Industry Council of Milwaukee County has used the survey results to determine training areas for its subcontractors.

The Milwaukee Public Schools' interest in job openings data was for counseling students about the labor market and career education. In cooperation with the Private Industry Council's Milwaukee Career Center and Milwaukee Area Technical College, the Employment and Training Institute designed a series of booklets and curriculum guides to present information on the labor market for students and new entrants into the labor market. A booklet on high demand jobs for skilled workers highlights the jobs and pay rates for careers requiring one to two years of post-secondary education and a second booklet profiles careers requiring four or more years of college. Both combine vacancy data detailing high demand occupations, hourly rates of pay and location of jobs with occupational outlook information, position descriptions and training programs at the postsecondary level. Booklets are distributed to students through the career education program, school-to-work coordinators and high school counselors.

On a policy level the zipcode location of job openings has helped focus local and state policy discussions on the geographic mismatch between job seekers in the central city where openings are limited and job availability in the outlying areas where demand has remained high and unemployment is at the 2 to 3 percent level. Recent debates over welfare reform initiatives have used vacancy survey data to focus discussion on skill level and geographic mismatches facing new entrants into the labor force. Transportation policies for mass transit and welfare reform initiatives have been modified and targeted to geographic areas where demand is high for full-time positions. Job training programs for welfare employment initiatives have used vacancy data to identify occupations where short-term training could be offered for welfare recipients entering the labor force.

II. BACKGROUND ON JOB OPENINGS SURVEYS IN THE UNITED STATES

Over the past fifty years the United States has conducted a series of pilot projects, experiments and feasibility studies related to vacancy surveys and job openings. (For a discussion of the period 1940-1978, see Appendix IV.) The United States Department of Labor (USDOL) conducted pilot projects to test the feasibility of collecting job vacancy data as early as World War II and the Korean War. In 1956 the USDOL Bureau of Labor Statistics conducted a feasibility project which found that employer records on vacancies were unreliable and that many employers were unable to report the number of vacancies.

In the mid-1960s, when unemployment rates were very low and the number of job vacancies may have equaled the number of unemployed persons, a common view among academics and policy makers was that vacancies beyond normal turnover were the result of the unemployed not knowing how to find available jobs, with job seekers mainly needing mechanisms to link up with the companies advertising vacancies (Abraham, 1983). However, perceived "skill shortages" in higher skilled occupations were being reported at the same time that lesser skilled workers in manufacturing remained unemployed.

This renewed interest prompted another pilot project to assess the feasibility of conducting vacancy surveys and to examine the use of vacancy statistics in other countries. By 1965 the Experimental Job Vacancy Program included sixteen cities. The purpose of the program was not only to test the feasibility of gathering vacancy estimates by occupation but also to examine the usefulness of the results for improving the labor exchange function, identifying occupational training needs and informing economic policy analysis. This experiment was subsequently expanded nationally.

In the period 1969-1973 the Bureau of Labor Statistics (BLS) established the JOLTS (Job Openings and Labor Turnover Survey), a state-administered survey that collected data primarily in the manufacturing sector. The JOLTS regularly surveyed manufacturing establishments to obtain data on long-term and short-term job openings, new hires and separation rates. The survey did not collect data on full-time/part-time status, wages or benefits, but provided valuable economic indicators to gauge local labor market trends. Throughout the 1970s, Wisconsin and Minnesota continued to collect job vacancy data despite federal lapses in support of the program. However, funding for this joint federal and state effort was eventually discontinued by the federal government, and the JOLTS survey was last administered in December 1981.

In 1979-80, the BLS Job Openings Pilot Program (JOPP) conducted pilot programs in four states to assess the feasibility of collecting data nationally. A series of three surveys over six quarters collected data on vacancies (quarterly) and labor turnover (monthly) to test the feasibility of collecting establishment data using a variety of methods. Examinations of sampling size and measurement error led to recommendations on sample size and estimates of costs. (See Appendix V for a detailed discussion.) A subsequent recommendation was made that such surveys were possible but too expensive to administer. The program found that telephone interviewing was effective with establishments with less than fifty employees while mailing to specific individuals was recommended for larger establishments. Use of job openings data at the occupational level was associated with very high sampling error, and the program recommended that much larger samples be used to provide reliable estimates.

In 1991 the BLS embarked on another similar pilot project, the Employee Turnover Job Openings (ETJO) experiment, driven by a renewed interest in identifying occupational labor shortages and determining where hard to fill openings were occurring (BLS, 1991). Driven primarily by a Congressional directive to "...develop a methodology to annually identify national labor shortages" and secondarily by immigration policy concerns related to occupational demand, ETJO was designed to assess the use of computer-assisted telephone interviewing (CATI) techniques and the Occupational Employment Statistics (OES) survey to conduct vacancy surveys. Once again this pilot project found that vacancy surveys with occupational detail could be conducted at a national level but even if only conducted annually would be very expensive.

III. VACANCY SURVEYS IN OTHER COUNTRIES

National vacancy surveys using a sample of establishments have been successfully conducted in Australia, United Kingdom, The Netherlands, South Africa, Canada and New Zealand. (See Appendix VI.)

A. Australia

The Australian Bureau of Statistics has conducted the Quarterly Survey of Job Vacancies and Overtime (JVO) since 1983. This telephone survey is used to estimate vacancies at the state and national level. The vacancy estimates are used as a major economic indicator and for forecasting. While the JVO performs similarly to Australia's want ads index as an economic indicator, it provides data by industry and state using a far superior methodology. Employer participation is mandatory and the response rate is 99 percent. The survey is simple and asks for only four numbers (i.e., total number of employees, paid overtime hours, number of employees paid overtime, and number of job vacancies). Survey forms are mailed out prior to the phone survey. Response time varies by size of establishment from 7 minutes for establishments with fewer than 5 employees to 19 minutes for establishments with more than 20 employees. The sample size is designed to have a relative standard error of 25 percent or less at the state level and 10 percent or less at the national level. (See the full "Economic Overview" report in www.treasury.gov.au under "publications," "economic publications," "Economic Roundup.")

The sample is stratified by size, industry, state and private/public establishments. All establishments with 100 or more employees are surveyed along with a sample of establishments with less than 100 employees. Turnaround time is six weeks after the survey date and statistics provided include estimates of vacancies, job vacancy rates and overtime levels.

In 1991 Australia conducted a study to determine the feasibility of collecting job vacancy data by occupational category with particular emphasis on difficult-to-fill vacancies. The Survey of Labour Demand (SOLD) study found that a mail-out survey was feasible, reporting that the mail survey is "less expensive and provides better results than a telephone collection. The telephone methodology is satisfactory for seeking data at aggregate level on vacancies but it is doubted that it would be sufficiently robust to give reliable data at a dissected level." The SOLD study concluded that because the JVO survey asks only four questions it could be conducted by phone, but that adding more questions would not be feasible for a phone survey. The SOLD survey, however, is no longer in operation and vacancy occupational data are now collected for skilled occupations using the want ad based skilled vacancy survey. (See www.deetya.gov.au/aed/svs/svshome.htm)

B. United Kingdom

"Skilled Needs in Britain" is published annually ( www.open.gov.uk/dfee/skneeds) detailing skill needs and training. Based on a survey of 4,000 employers with 25 or more employees, results are provided by sector size and region, detailing hard-to-fill vacancies, participation in training programs and occupations in high demand. Listings with the public employment service are also reported monthly on placements and vacancies.

C. The Netherlands

Quarterly establishment surveys have been conducted since 1980 in the Netherlands to gather data on vacancies, new vacancies, filled vacancies and level of education by industry and establishment size. Survey results are used as an economic indicator and for counseling new labor force entrants. Public employment service listings and want ads are also used to examine vacancies. However, in the Netherlands companies are not required to list jobs with the employment service.

D. South Africa

A manpower survey is conducted annually covering the number of workers in fixed occupational categories and detailing the number of vacancies in each occupation. In 1994, 8,810 firms were involved in the survey. (See www.statssa.gov.za, Statistical Releases, Labour, Manpower Survey.)

E. Canada, New Zealand

Canada and New Zealand have conducted job vacancy surveys to identify skills shortages for immigration policies. Canada abandoned its survey in 1978 and New Zealand in 1989.

IV. EMPLOYMENT AND TRAINING INSTITUTE/ISPR SURVEY

The design of the University of Wisconsin-Milwaukee Employment and Training Institute (ETI) job openings survey draws upon the experience of JOLTS, JOPP and the more recent Department of Labor survey of Employee Turnover Job Openings (ETJO). The ETI survey includes data on full-time and part-time employment, the location of employment, wage rates, availability of fringe benefits, education and training requirements, and employer-identified hard-to-fill openings.

Information on the job titles of openings is particularly important to determine occupational needs of employers and availability of entry-level positions which may offer steps for promotions within companies. The location of employment is essential not only for the value it may have for descriptive purposes but also because companies often have sub-units throughout the state or region whose hiring and wage reporting is consolidated at one office. Job site data provide a check to ensure that metropolitan employment trends are based on openings in the specified four-county area rather than a reflection of broader geographic units covered by the establishment's administrative office. Data on minimum levels of education and special training required for vacancies provide benchmarks from which to gauge possible training and placement efforts. Fringe benefit availability is examined as an important determinant of adequate employment and family-supporting jobs.

For this survey the Employment and Training Institute works with the University's Institute for Survey and Policy Research (ISPR, formerly the Social Science Research Facility) to draw a representative sample of employers from the covered employment ES-202 file for the Milwaukee four-county metropolitan area. The methodology incorporates a stratified sample by size of company and Standard Industrial Classification (SIC). Respondents are asked to complete the information based on actual openings in a specified week (e.g., the week of October 19, 1998).

In order to improve initial response rates and to minimize the employer effort required, the ETI survey uses a one-page form limited to data on vacancies and new hires only. The one-page form has been achieved by asking companies simply to list job titles for which there are current openings and using University staff to code and categorize the occupational listings. Establishments are provided return mail envelopes addressed to the University.

The sample population is stratified by size and type of company using the following criteria to ensure an adequate response rate and over-sampling of large employers, governments and educational institutions (a sketch of this selection logic follows the list).

  1. 100 percent sample of employers with 250 or more employees.

  2. 100 percent sample of local government and education institutions.

  3. A 10 to 12 percent sample of the balance of corporations.

  4. Temporary help agencies and estates are excluded.
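
As an illustration only, the selection criteria above might be implemented along the following lines; the field names and the flags used to exclude temporary help agencies and estates are assumptions, not the Institute's actual programs:

    import random

    def select_sample(establishments, small_rate=0.11, seed=1998):
        # Stratified selection following the four criteria listed above (illustrative sketch).
        random.seed(seed)
        sample = []
        for est in establishments:
            if est.get("temp_agency") or est.get("estate"):
                continue  # temporary help agencies and estates are excluded
            if est["employees"] >= 250 or est["type"] in ("government", "education"):
                sample.append(est)  # 100 percent sample of large employers and public institutions
            elif random.random() < small_rate:
                sample.append(est)  # roughly a 10 to 12 percent sample of the balance
        return sample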

The mail response rate for the survey is typically 20 to 25 percent. In order to increase the response rate and to test whether non-respondents differ in any way from respondents, Employment and Training Institute staff attempt to contact approximately 2,000 of the non-respondents by phone to solicit their response to the survey. Phone contacts are attempted for all non-respondent companies with over 250 employees and for a sample of the balance of non-respondent companies. The initial mail respondents for the May 1993 survey were compared to a randomly selected population of non-respondents who were solicited by phone. The two groups were analyzed to test for differences in reported job openings. Only a few of the stratified cells showed any difference and analysis of the overall sample showed no statistical difference in the two populations. A test for differences in means was conducted for number of job openings with no significant difference evident between groups.

Results for the sample population are weighted by size, type of industry and response rate to project the total number and type of jobs available in the metro area. Response rates for questions concerning hourly wages and qualifications are lower than for data on type of job openings due in part to missing data attributable to jobs where salary is based on commission. As a result, two additional weighting formulas are applied to adjust for missing data. The three weights used to project full-time and part-time openings are: 1) the total population responding to the survey, 2) the population detailing wage rates, and 3) the population completing questions on fringe benefits and job requirements. The use of these different weighting formulas permits more reliable estimates but results in slightly different totals across tables for both full-time and part-time openings.
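
The three weights can be illustrated with a minimal sketch; the cell structure and example counts below are assumed for illustration and are not the Institute's production programs:

    def make_weights(universe_n, respondents, wage_respondents, benefit_respondents):
        # universe_n: establishments in this size/industry cell in the ES-202 universe
        # respondents: cell respondents used to project openings
        # wage_respondents: subset that also reported hourly wages
        # benefit_respondents: subset that also answered benefit and requirement questions
        def w(n):
            return universe_n / n if n else 0.0
        return {"openings": w(respondents),
                "wages": w(wage_respondents),
                "benefits": w(benefit_respondents)}

    # Example: a cell with 400 establishments, 80 respondents, 60 wage reports and 50 benefit
    # reports yields weights of 5.0, 6.7 and 8.0, which is why totals differ slightly across tables.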

The combination of non-response rates and low job opening rates for some cells results in extremely high weights for a small number of cells which could distort findings on the number and types of jobs open. A test is conducted on each weighting cell by both type of company and size, and those responses more than one standard deviation above expected are excluded from the analysis. (The resulting weighting methodology was tested by using the sample data to project overall employment for the SMSA and comparing these projections with published May 1992 ES-202 employment total. Estimates of employment based on weighting of survey respondents' reported current employment levels fell within 10 percent of the actual reported levels for the metropolitan area, lending confidence to the weighting procedure used.)
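
The one-standard-deviation screen described above might be implemented as in the following sketch, assuming "expected" refers to the mean openings reported within a weighting cell (the data structures are illustrative):

    import statistics

    def flag_outliers(openings_by_cell):
        # openings_by_cell maps (size_class, industry) to the openings reported by each respondent
        excluded = []
        for cell, values in openings_by_cell.items():
            if len(values) < 2:
                continue
            cutoff = statistics.mean(values) + statistics.stdev(values)
            excluded.extend((cell, v) for v in values if v > cutoff)
        return excluded  # responses to drop before projecting cell totals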

After survey results are tabulated and weighted by size and type of industry and by response rate to project the total number and types of jobs available in the metropolitan area, a fifty-page report is prepared for the government partners summarizing the findings. Analysis is provided in the following areas:

A 4-6 page summary paper identifies key findings, along with listings of occupations with labor shortages. The summary is distributed to government officials, educators, community agencies and resident organizations in the Milwaukee area and is posted on the Internet. (See www4.uwm.edu/eti)

V. JOB VACANCY SURVEY INSTRUMENT AND TECHNICAL APPROACH

To assist state and local planners in developing job vacancy surveys for their local labor markets, information is provided on survey design, sampling, methodology, weighting, survey administration, data verification and data analyses.

A. Questionnaire Design

Current and past surveys of job openings often consist simply of a count of vacancies combined with additional questions about the establishment's workforce. In the JOLTS, JOPP, ETJO and SOLD surveys, labor turnover questions are also examined at the occupational level. In Australia the number of employees, overtime hours and employees working overtime are examined. Increasing the level of detail requested may place a burden on employers and reduce response rates.

The design of the ETI survey was driven by the interests of the participating governmental partners as well as a desire to keep the survey as simple as possible. The resulting one-page form was field-tested and has been used with minor modification since 1993. Surveys are conducted each May and October to assess openings for immediate hire as of the 4th Monday of the month. (See Appendix I for the cover letter and survey instrument used in Milwaukee.)


Data Requested on the Employment and Training Institute Survey


The instructions for completing the survey are brief and include a definition of "job openings" and a note to include only those openings available for work sites within the four-county labor market. The total number of employees in the establishment is requested as a check that multi-site companies are answering appropriately. Data on the zipcode of the worksite for each job opening are used to assess the demand for workers in subareas of the labor market and serve as a further check on multi-site establishments' responses.

Recent surveys conducted by the Employment and Training Institute show that 20 to 30 percent of establishments have openings at a given point in time, with the likelihood of having openings increasing with the size of the company. Because many establishments are small (where less than 10 percent report openings), most vacancy surveys can be completed rapidly (by mail or phone) when there are no openings. Expanding the number of questions to include data which require companies to examine their wage and hour records increases the time and effort required of employers and may reduce response rates (both for the 20 to 30 percent of establishments with openings and the 70 to 80 percent of establishments with no openings). While large companies may be willing to provide job listings and data on current openings (which are often detailed on their weekly job listing sheets), they may be unwilling to retrieve data for other questions (which require additional human resource staff retrieval time or which require involvement of other company staff outside the personnel office).

The job titles are used to track emerging occupations and to identify high demand fields while not requiring employers to classify their jobs into standardized occupational groups or codes. The number of full-time and part-time openings and zip code location provide the basis for constructing the overall estimate of openings for the metropolitan area. The job site zipcode data are completed by almost all respondents who report openings, although occasionally employers in businesses like building security and home health care services report openings for the entire area (rather than by zipcode location) due to the daily movement of employees.

The remaining questions are unlikely to receive a complete response from all establishments with openings, and weights need to be adjusted accordingly. Hourly wage and salary data are often omitted with reasons including "depends," "commission" or "varies." When pay ranges are provided, the lower end of the hourly wage or salary is used, and salaries are converted to an hourly rate. Similarly, questions on fringe benefits are often missing or ambiguous (e.g., "varies," "depends" and "401K"). In some cases health insurance and pension benefits are available 3-6 months after employment (coded as "yes"). Establishment comments regarding "Prior Level of Education or Training Required" are recorded verbatim to aid in identifying the level of education or occupation-specific training required.
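
The pay-rate coding rules (take the lower end of a range, convert salaries to an hourly rate, and treat answers such as "depends," "commission" and "varies" as missing) might be sketched as follows; the 2,080-hour conversion factor and the threshold for recognizing an annual salary are assumptions for illustration:

    import re

    HOURS_PER_YEAR = 2080  # assumed 40 hours x 52 weeks

    def to_hourly(pay_text):
        numbers = [float(n) for n in re.findall(r"\d+(?:\.\d+)?", pay_text.replace(",", ""))]
        if not numbers:
            return None                # "depends," "commission," "varies"
        low = min(numbers)             # lower end of any range reported
        if low > 1000:                 # treated as an annual salary
            return round(low / HOURS_PER_YEAR, 2)
        return low

    # to_hourly("$7.50-9.00/hr") -> 7.5; to_hourly("$28,000 per year") -> 13.46; to_hourly("commission") -> None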

Employers are asked to indicate which jobs open for immediate hire are "difficult to fill" positions. Although the term "difficult to fill" is not defined, this category is helpful in identifying occupations with shortages as well as indicating the general level of satisfaction of employers with the available pool of entry-level workers.

Large companies with many openings often are willing to mail or fax job listings with FTE's (full-time equivalents) or the number of full-time and part-time openings and the job location but may be unwilling to answer questions not on their job listing sheets. In the health field, for example, hospital mergers have resulted in centralized hiring for hospitals, nursing homes and other health care facilities. These establishments report a high volume of openings on their weekly job posting sheets which offer a sufficient level of detail to substitute for the data requested on the ETI survey form. Access to job postings via fax and the internet is also increasing but the limited level of detail on number of openings and location of worksites makes electronic postings only occasionally usable.


Timeline for Fall 1998 Survey


B. Vacancy Definitions

Four examples of job vacancy definitions from past and current survey efforts are shown below. The JOLTS, ETJO and ETI surveys all ask that vacancies be counted for a specific date while the Australian JVO survey requests data for the pay period preceding or following a specific date.

C. Printing, Mailing and Postage Issues

Regardless of whether the job openings survey is conducted via phone, mail or a combination of both, it is advisable to mail out an introductory letter describing the purpose of the survey, including a copy of the survey form and providing a written commitment that survey responses of individual companies will be held in strict confidence and that no data will be released which identifies individual firms. Sponsorship of the survey by a partnership of local education, job training and government institutions with an interest in improving the delivery of training targeted to employer needs may increase the likelihood of response. A letter prepared on University of Wisconsin-Milwaukee letterhead signed by all partners has been used in the Milwaukee area survey. Using the local Private Industry Council or human resource board may also be appropriate. However, using Job Service stationery, for example, would not be advisable if employers have the impression that their responses may be listed in the employment service's job bank.

Whatever source is used to identify establishments to be surveyed, there are likely to be inaccuracies in addresses which will result in mail being undeliverable. Most bulk mail and presort services will scan addresses and flag those likely to be undeliverable. Corrections can then be made to the file prior to mailing, thereby reducing mailing and handling problems. Whenever possible, addresses should be corrected prior to the initial mailing. For large establishments it is also advisable to add the name of a human resource person to the mailing address. Unfortunately, the ES-202 file provided to researchers in Wisconsin does not include the name of a contact person and "Attention: Human Resources" is used instead. Vendors who provide mailing lists often include a contact person at each establishment.

When utilizing mailed survey forms with the expectation of a mail response, a return envelope with postage guaranteed should be included. The establishment's name and address must appear on the survey form so that the survey results can be associated with the appropriate SIC code and size of employer. An accession number included on the survey form can be used to retain data on survey respondents. A windowed mailing envelope may be used if the establishment's address and accession number are printed on the back of the survey form. This simple technique allows machine stuffing of the survey (along with a cover letter to employers and a return envelope).

Mailing and printing costs will, of course, vary. When using bulk mail, it is advisable to request the "Forwarding Service Requested" option which forwards mail for up to twelve months and returns mail after that date with the new address and reason for non-delivery marked on the returned envelope. All undelivered surveys should be examined to identify correctable addresses, followed by phone calling to determine which establishments are no longer operating in the metropolitan area. Typically, 5 percent of surveys are returned as "F.O.E." (forwarding order expired) or "undeliverable" even after extensive address clean-up, and of these one-third are no longer in business.

D. Address Correction Problems

Both the ES-202 files and GENESYS files (see discussion in Appendix II.) were examined for address problems. Mailing correction software was used to flag records with incorrect addresses and correct zipcode and address suffixes where possible. In the ES-202 file for the four-county Milwaukee area, 35,546 records were checked for proper addresses: 12,416 (or 35 percent) required corrections by the mailing software and 4,942 records (14 percent) were identified as incorrect and requiring manual look-up. Using the GENESYS phone-book-based file, 51,483 records were checked: 10 percent were corrected by the mailing software and 3 percent were identified as incorrect and requiring manual look-up. Multi-site addresses are often a problem in the ES-202 file where establishments list addresses which indicate the company site location but do not provide a proper mailing address (with incorrect zipcodes the most common error). Because the state does not mail unemployment compensation material to these multi-site units, the addresses remain uncorrected in the state ES-202 file.

Because the unemployment compensation function is the primary use for the ES-202 file, many companies substitute their payroll or accounting office for their main company address. In some cases the address given is a payroll processing unit that may be out of state. Smaller companies often provide their bookkeepers' address and phone number, but with no indication that this is not the company address. (The only clue that the address and phone number may be incorrect is the use of a second line of the address with "Attn:...") When telephone follow-up calls are made, an initial verification of the correct establishment name can eliminate any confusion. However, when the survey is mailed to the bookkeeper or payroll unit, staff rarely forward the survey to the human resources office or company owner or the survey is forwarded well after the due date for responses.

For large establishments it is advisable to direct the survey to an individual in the human resources department while for smaller establishments the owner would be the appropriate contact. The ES-202 file does not list a contact person in the version researchers receive and consequently all mail is directed to "Human Resources" except for those companies with 250 or more employees which are called prior to mailing to obtain the name of an appropriate contact person. The GENESYS file lists a contact person; however, for larger establishments the contact is the Chief Executive Officer.

E. Suggestions for Handling Undelivered Mail

The post office will return mail it has not delivered for several possible reasons. These include:

Once undelivered mail has been sorted into the above categories, a search for correct addresses and phone numbers should be conducted. The phone number is needed since it is unlikely that a timely response would be received if the survey were mailed out a second time. Follow-up surveys are conducted by phone or faxed to companies with more than ten openings (after an initial phone contact). Several sources may be used in the search for a correct address and phone number.

  1. Review of the business white pages

  2. Review of directories of business pages that list for the 'greater' [metro] area

  3. Review of the yellow pages if the type of business is known

  4. Use of computerized telephone listings

  5. Calls to directory assistance.

F. Telephone Contact Problems

For purposes of telephone follow-up the ES-202 file needs considerable correction prior to follow-up survey phone calling. Typically, 20 to 25 percent of phone numbers are missing in the file or listed for an area code outside the metropolitan area. Multi-site establishments often fall into this category because the company's primary interest is in listing site addresses, not phone contacts. Locating addresses in the white and yellow pages can also be a challenge where multi-sites are identified in the ES-202 file by a FIPS code or store number and municipality code but do not show a local street address or phone, making it necessary to look up the FIPS code and then identify the location within each municipality. For fast food chains and other establishments with more than one location in a municipality and no local street address specified, one of the establishments in the municipality is randomly selected for the follow-up survey.

Companies where there is a non-working number or a disconnected number should be looked up again and if not found, put into the dead category of "no longer in business."

G. Response Data

The response rate for mailed surveys is usually 22 to 25 percent, with smaller establishments having lower response rates than larger firms. A random sample of half of the non-respondents is drawn for telephone follow-up interviews. There is typically a 50 percent plus response rate on the telephone interviews, with larger companies having lower rates of response. Firms with 250 or more employees are much more likely to respond by mail than through calling, with a 40 percent mail response rate compared to a 20 percent response rate on follow-up calling. Establishments with fewer than 5 employees typically have a 20 percent response rate by mail and a 50 percent response rate by phone, and companies with 5 to 250 employees typically show a 25 percent mail response rate and a 60 percent telephone response rate.

H. Surveying by Phone, Mail or Both

Use of mail surveys in conjunction with follow-up telephone surveys has proven to be a cost-effective and timely way to collect survey data. Mail responses are, of course, the least expensive method of obtaining responses. Because an introductory letter and survey form should be mailed to all sampled establishments regardless of whether a mail or telephone survey method is used, the cost of adding a return mailer with guaranteed postage is minimal, as only returned pieces are charged at a rate of $0.32. The disadvantage of the mail method is that employers with no openings may assume that their responses are not wanted.

The sample size will also influence the choice of surveying method and the timeliness of survey responses. The longer the calling takes and the greater the length of time elapsed between the contact with the establishment and the survey week designated for openings, the more difficult it may be to retrieve accurate data on vacancies. A combination of methods might also be considered where "calling only" would be done for certain sizes of establishments and mail plus phone follow-up for the balance, so that calling during the first week would be done for the "call only" population and then in Week 2 for mail non-respondents.

I. Suggestions for Follow-Up Telephone Calling

Companies are asked to respond to the survey within one week of the fourth Monday of the month, and calling begins three days after the due date. After the first week of returned surveys, the responses are matched using the establishment's survey accession number attached to the original sample population file. A random sample of companies that did not respond and did not have mail returned by the post office is generated for telephone interviewing. Follow-up calls to these companies are completed within one and a half weeks of the due date for the survey. Because the date of vacancies for hire that is specified on the survey is one week prior to the due date, it is important that the telephone calls be made as soon as possible; otherwise the accuracy of the vacancy data will decrease.

A script is provided which should be followed by all telephone interviewers. It may be necessary at times to offer additional information, particularly to large companies, about the purpose of the survey or to suggest that they fax a job list. The most experienced interviewers are assigned to these more difficult companies. Otherwise, the script provides the interviewer with an appropriate outline for making the calls and handling follow-up questions.


Sample Script for Follow-Up Calls

"Hello. My name is ______ and I'm calling from UWM. Two weeks ago we sent a survey to your office asking whether you had any jobs open for immediate hire as of Monday, October 19th? All company names and responses are confidential.

"Did you have any job openings two weeks ago on October 19th?"

"Thank you very much for your cooperation."


Calling Codes:
AM - answering machine [Don't leave message]
B - busy
CB - call back
DC - disconnected phone
DK - doesn't know whether any openings
FAX - fax machine
NA - no answer
NP - (changed to) non-published number
RR - refused to respond
WN - wrong number

[Background information on the survey ONLY IF ASKED: "We conduct this survey for the City of Milwaukee, Milwaukee Area Technical College, Milwaukee Public Schools and Private Industry Council of Milwaukee County to help prepare workers for jobs available. For example, MATC uses the employer information to help determine whether to expand technical training in areas where companies need workers."]


Each interviewer receives a call listing that provides the name, address, zipcode, phone number, and size of company. Next to each listing is a place for codes and comments to be made about each call. A code is entered for each attempt made.

Sometimes a company requests that the survey and cover letter be faxed to them. If this is done, an interviewer should note "sent fax" on the call list as well as enter the fax number and contact person's name. If a survey is faxed, the accession number of the business is written on the form so it can be identified when returned. Details are also noted on the call list if specific follow up information is given. For example, if a name and/or time is offered for call back information, this is noted next to the "CB" code. If an alternative number is offered for later contact, this is also written on the call sheet by the interviewer. If a different interviewer follows up or finishes another's list, he/she can determine from the codes what needs to be done next. The codes are later used to determine non-responses, dead companies, and responses.
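
One way the calling codes might be rolled up into final dispositions is sketched below; the grouping of codes is an assumption based on the categories described above, not a prescribed rule:

    DEAD_CANDIDATE = {"DC", "NP", "WN"}  # look up again; if no new number is found, mark "no longer in business"

    def classify(call_codes, completed_interview):
        # call_codes: the ordered list of codes entered for one establishment
        if completed_interview:
            return "response"
        last = call_codes[-1] if call_codes else None
        if last in DEAD_CANDIDATE:
            return "possible dead company"
        return "non-response"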

If a response is received, the interviewer writes down the number of employees and the number of openings for the firm on the call list. A one-page survey form is completed only if the company had job openings on the specified date.

Troubleshooting

J. Using Internet Listings of Job Vacancies

It is especially important to obtain information from large employers in the region when conducting job vacancy surveys. As noted, however, it can be difficult to reach an HR person in a large company who is willing or able to take the time to respond to a survey. One way around this obstacle is to use the World Wide Web, or Internet, to obtain the necessary job openings data for the company. Some commercial establishment data sources contain websites for companies, which can make the search for websites much easier.

Most company websites will not have the level of detail necessary for use in a vacancy survey. (Of 37 Milwaukee area companies whose websites were checked, five had what appeared to be usable vacancy listings.) Establishment website employment listings should be examined carefully to determine whether the information provided is adequate. Particularly useful are listings for companies which include multi-site establishments. Website listings usually identify job titles, worksite location, education and training requirements, and availability of fringe benefits. The data most often omitted from website listings are the number of openings for each job posted, the rate of pay, and whether the position is considered "difficult to fill." If the pay rate is missing, listings can still be used with the pay rate response omitted. For companies that do not detail the number of openings, it can be assumed that there is at least one opening for each position listed (or at least two openings where the plural is used). These assumptions may, however, result in an undercount of the currently available job openings in a region. In the case of retail and service jobs (like clerks for a chain of department stores or delivery company employees), it may not be appropriate to assume only two openings for a listing that reads "Store Clerks" or "Warehouse Workers." It is probable that numerous positions are available and the undercount may be substantial if only two openings are assumed to exist. In many cases the company must be contacted for further detail about the openings listed.
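
The counting assumptions above can be written out as a small sketch; the plural test is deliberately crude and is an illustration only:

    def estimate_openings(listing_title, stated_count=None):
        # stated_count: number of openings if the posting gives one, otherwise None
        if stated_count:
            return stated_count
        # Plural titles ("Store Clerks") are assumed to mean at least two openings, singular titles one;
        # as noted above, both assumptions can understate actual demand.
        return 2 if listing_title.rstrip().endswith("s") else 1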

Website job listings tend to be incomplete for many companies. Often, only positions for skilled employees are listed and vacancies in areas like maintenance or office support (e.g., clerical assistants) are excluded. In some cases the website is used only as a recruiting tool for more difficult-to-fill positions. All listings should be reviewed thoroughly to be reasonably certain that employment postings are inclusive of all types of jobs available at the company. Also, the heading and/or introduction for the employment webpage should be reviewed for indications that listings are only for full-time or professional positions and not an all-inclusive job vacancy listing for the company. The Internet may be used for job vacancy information in a survey if the company has verified the accuracy of using such a listing and the information posted has been scrutinized to assess its completeness.

K. Data Entry and Coding

The initial mail response in the first week after the survey is mailed accounts for most (75-80 percent) of the total returned surveys. Each survey is sorted by whether or not the establishment had openings. The unique accession number assigned to each survey is immediately entered and matched with the sample list to determine the remaining population targeted for telephone follow-up. Surveys with openings are then forwarded to the data entry staff. As telephone calling begins and mail returns continue, the sub-sample designated for calling is amended daily to avoid unnecessary follow-up calls, and returned surveys continue to be forwarded for data entry.
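
The daily pruning of the telephone follow-up pool might be sketched as follows (the field names are illustrative):

    def update_call_list(sample, returned_accessions, undeliverable_accessions):
        # Remove establishments that have responded, or whose mail came back, from the follow-up pool.
        done = set(returned_accessions) | set(undeliverable_accessions)
        return [est for est in sample if est["accession"] not in done]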

As calling concludes, verification of data entry begins and each job listing is assigned an occupational code. In the initial design of the Milwaukee area survey, U.S. Decennial Census Occupation codes were chosen to allow comparison with the 1990 Equal Employment Opportunities Commission census file used to drive affirmative action planning and related 1990 demographic reports. There are 511 occupational classifications in 13 major groupings (see Appendix III) using the 1980 Standard Occupational Classification (SOC) codes as the base. In some cases SOC codes are collapsed. However, in most cases the same single listing as found in the SOC codes is used.

Assigning the census occupational code to job listings based on company job titles rather than requiring the employer to determine the code decreases the burden on employers but makes classification difficult for some occupations. The SOC manual is used to look up job titles not found in the abridged U.S. Census listing. Emerging occupations are categorized into codes with the closest match. Hourly rates of pay, education/training pre-requisites and the SIC code of the company are used to improve the code assignments. For example, a "manager" of a fast food franchise earning just above minimum wage would be properly put into the classification of "supervisors, food preparation and service occupations" rather than "managers, food serving and lodging establishments." Similarly, a lower paid "personal banking manager" would be classified as "securities and financial services sales occupation" and not the "financial managers" classification under the executive and management group.
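
Refinement rules of the kind described above might be sketched as follows; the wage thresholds and the use of SIC 58 (eating and drinking places) are illustrative assumptions, and the returned labels stand in for actual census code numbers:

    def refine_occupation(title, hourly_wage, sic):
        t = title.lower()
        if "manager" in t and sic.startswith("58") and hourly_wage is not None and hourly_wage < 8.00:
            # a low-paid "manager" at an eating and drinking place
            return "supervisors, food preparation and service occupations"
        if "personal banking manager" in t and hourly_wage is not None and hourly_wage < 15.00:
            return "securities and financial services sales occupations"
        return None  # fall back to the SOC manual look-up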

Coding job titles into occupational classifications permits the analysis of high demand and difficult-to-fill vacancies as well as providing background data for career education materials on job availability and rates of pay. Emerging occupations particularly in the medical and computer fields are used by the local technical college to get a better picture of the type of jobs employers are reporting compared to occupational training programs at the college. In all cases the confidentiality of individual establishments is protected. Information on individual companies is never shared nor are data or combinations of variables which would allow identification of job openings for an individual employer.

Once occupational codes are assigned, variables are created which classify occupations by level of education/training and experience required as detailed on the survey form. These include jobs requiring 1) a four-year college degree or more, 2) community college degree or diploma, 3) certification or licensing, 4) high school completion, 5) a commercial driver's license or driver's license (CDL/DL), or 6) prior occupational experience. These categories allow analysis of demand based on the type of education, training or experience required as well as identification of entry-level jobs with no education or training requirements.

After variable coding has been completed, the file of openings is examined for internal consistency and accuracy. The assignment of census occupational codes is re-examined by a sorting of codes in rank order, sorting by salary, and then a sorting by education variables to make certain appropriate coding decisions have been made. The number of openings per establishment is also examined by reported size of employer to identify three types of errors: 1) some employers mistakenly report the job titles of their employees rather than of their vacancies; 2) some establishments report vacancies for sites outside of the four-county labor market; 3) some multi-site establishments (e.g., financial institutions) report employment for all locations on one form instead of on each of the multi-site forms. The establishment size reported in the sample file is used together with the number of employees listed on the survey form to check on the accuracy of both entries. The ES-202 file consistently has errors on the size of employment, and it is critical for weighting purposes that large establishments with many openings are weighted with like-sized establishments and types of businesses. Data on the number of openings and size of company reported on the returned survey form are matched with the original ES-202 sample file to identify reporting inconsistencies and to make corrections where appropriate.
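
The three error checks described above might be implemented along the following lines; the ratio and multiplier thresholds are illustrative, not the Institute's actual edit rules:

    def check_response(survey_row, sample_row, local_zips, max_ratio=0.5):
        problems = []
        if survey_row["openings"] > max_ratio * survey_row["employees_reported"]:
            problems.append("openings unusually high -- employer may have listed employees, not vacancies")
        if any(z not in local_zips for z in survey_row["zipcodes"]):
            problems.append("worksite zipcode outside the four-county labor market")
        if survey_row["employees_reported"] > 2 * sample_row["es202_employees"]:
            problems.append("reported employment far exceeds ES-202 size -- possible multi-site consolidation")
        return problems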

L. Identifying Dead Companies

For establishments with undelivered mail surveys or disconnected telephone numbers, every effort is made to determine whether the establishment is in fact no longer in business. Both the yellow and white pages are checked, and the size of the establishment is considered, in an effort to obtain an alternate address or phone number. Where none can be found, the establishment is considered to be no longer in business and is recorded as an establishment with no job openings.

M. Sample Selection

A detailed discussion of statistical methods used in establishment surveys can be found in the Statistical Policy Working Papers prepared by the Office of Management and Budget's Federal Committee on Statistical Methodology. Working Paper #15 on the "Measurement of Quality in Establishment Surveys" profiles current practices used in federal agencies related to survey quality. Sample design, estimation, survey methods, operations and measurement error are discussed, providing an excellent resource for practitioners. Working Paper #17 on "Survey Coverage" provides guidance on how to assess and improve coverage, taking advantage of current practices used by federal agencies. These and other papers can be found on the Internet. (See www.bts.gov/fcsm/methodology.)

The sampling methodology used for establishment surveys conducted by the Department of Labor is detailed in the "Bureau of Labor Statistics (BLS) Handbook of Methods." The Occupational Employment Statistics (OES) survey uses the ES-202 files together with supplementary data for non-covered establishments. All establishments with 250 or more employees are sampled and those establishments with 5 to 250 employees, according to USDOL, "are sampled with probability proportional to the size class within each three-digit industry. Establishments employing four or fewer employees (i.e., size class 1 establishments) are not sampled. Instead, the employment for these establishments are accounted for by assigning a larger sampling weight to establishments employing five to nine employees." (See www.bls.gov/opub/hom/homhome.htm.) Response rates are typically over 75 percent for the OES survey.
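
The sketch below illustrates the general idea of combining certainty units with probability-proportional-to-size selection in a single stratum; it is not the BLS procedure, and in the OES design the selection is carried out separately within each three-digit industry and size class.

    # Simplified single-stratum sketch of certainty units plus systematic
    # probability-proportional-to-size (PPS) selection; not the BLS procedure.
    import random

    def draw_sample(frame, n_pps):
        """frame: list of dicts with 'id' and 'employment'; n_pps > 0."""
        certainty = [e for e in frame if e["employment"] >= 250]      # always taken
        eligible  = [e for e in frame if 5 <= e["employment"] < 250]  # PPS stratum
        # Establishments with 4 or fewer employees are excluded; their employment
        # is represented by raising the weight of the 5-9 employee class.
        total = sum(e["employment"] for e in eligible)
        step = total / n_pps
        start = random.uniform(0, step)
        points = [start + i * step for i in range(n_pps)]
        selected, cum, idx = [], 0.0, 0
        for e in eligible:
            cum += e["employment"]
            while idx < n_pps and points[idx] <= cum:
                selected.append(e)   # a very large unit may be hit more than once
                idx += 1
        return certainty + selected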

The Job Openings Pilot Project conducted in 1979 also used the ES-202 file but drew its sample from all establishments in the file, including those with no employees as of the first quarter. It could then capture seasonal employment in smaller establishments (e.g., landscaping, construction) which might have openings in quarters other than the first quarter, when seasonal downturns are typical.

Including these smaller establishments in the sample may increase the number of surveys considerably. While these smaller establishments make up a relatively small share of total employment, they may account for over half of the establishments listed in the ES-202 file. (See County Business Patterns for an approximation of the number of companies in the ES-202 file by size and SIC code.) These smaller firms are less likely to have openings than the larger establishments but still account for 10 to 20 percent of all vacancies due to the large number of establishments in the group.

When results are needed for planning purposes at an occupational level in local labor markets, error rates need to be considered in determining a sample size. When the sample-to-universe ratio and response rates result in large weights, the standard error may be so high as to eliminate the usefulness of the cell. For example, in the Job Openings Pilot Project some cells had weights in excess of 500 for smaller establishments with fewer than 50 employees. If some respondents had more than one opening, the openings would be multiplied against these large weights. Given that only 10 to 20 percent of respondents in the smallest firms have openings, a cell could easily have one establishment in an industrial sector and size category accounting for the total occupational estimate for the cell. (For a complete discussion of estimation and weighting issues in the JOPP study, see Appendix V.)
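
A hypothetical worked example of this problem: suppose a cell of small establishments carries a weight of 500 and only one of its five respondents reports any openings.

    # Hypothetical numbers illustrating how one respondent can dominate a cell.
    cell_weight = 500                      # weight for small establishments in the cell
    openings_reported = [2, 0, 0, 0, 0]    # five respondents, one with two openings
    estimate = cell_weight * sum(openings_reported)
    print(estimate)                        # 1,000 estimated openings from one response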

Even if the error rate is within an acceptable level, the weight times the number of reported openings may result in estimates unrealistic for an industrial sector of the local labor market. The decision to go with a large sample in the Milwaukee survey was based in part on the goal of keeping weights as small as possible, to avoid any concern that one or two cells might be distorting the estimates. The Employment and Training Institute's experience with detailed occupational data in the 1990 U.S. Census Public Use Microdata Samples (PUMS) files highlights the standard errors which may result from using weighted sample data at the occupational level. The computerized PUMS files, available for geographic areas of 100,000 or more, allow detailed analysis of census data using all responses on the survey of housing and population. They provide a housing record and a person record, each of which is given a weight that can be used to estimate the population being considered. Weights in many occupational cells resulted in unacceptably high error rates. In addition, some records had high weights which, when combined with a small number of records per cell, resulted in higher error rates. For example, three respondents with a combined weight of 150 had a standard error of 64 (or 43 percent). The average weight for Milwaukee area PUMS records was 33, with 94 percent of all weights below 50.

N. Weighting

In order to estimate the number of openings for the total population, weights need to be constructed that are sensitive to establishment size and type of industry. Stratification by one-digit SIC and 10 levels of employee size ranges results in 100 cells. For each cell, the total population divided by the sample results in the original weight for the cell. For example, if there were 4,100 retail establishments with 1 to 4 employees and a sample of 500, the original weight would be 8.2.

Adjusting for establishments which do not respond increases the weight: the non-response adjustment factor for each cell is applied against the sampling weight of the corresponding cell. For example, if 177 of the 500 sampled establishments responded (35.4 percent) and the original weight is 8.2, the adjusted weight would be 23.2 for retail establishments with 1 to 4 employees.
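
A minimal sketch of this two-step weight calculation, using the retail example above:

    # Cell weighting as described above, shown with the retail 1-4 employee example.
    def cell_weights(universe_count, sample_count, respondent_count):
        """Return (original_weight, nonresponse_adjusted_weight) for one cell."""
        original = universe_count / sample_count
        adjusted = original * (sample_count / respondent_count)   # non-response factor
        return original, adjusted

    orig, adj = cell_weights(universe_count=4100, sample_count=500, respondent_count=177)
    # orig = 8.2, adj is approximately 23.2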

The resulting weights are then examined for extreme values, which typically result from cells with low response rates. These weights, when multiplied against total full-time and part-time openings for a cell, could distort findings on the number and types of jobs estimated to be vacant. A Chi-square test is conducted on each weighting cell by both size of establishment and one-digit SIC code; those responses more than one standard deviation above expected are excluded from the analysis.
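
The sketch below approximates this screen with a simple mean-plus-one-standard-deviation rule on reported openings within a cell; it is only a stand-in for the Chi-square procedure described above.

    # Hedged approximation of the outlier screen: within one weighting cell,
    # flag respondents reporting openings more than one standard deviation
    # above the cell mean. This is a stand-in, not the Chi-square test itself.
    import statistics

    def flag_outliers(openings_in_cell):
        """openings_in_cell: list of total openings reported by each respondent."""
        if len(openings_in_cell) < 2:
            return []
        mean = statistics.mean(openings_in_cell)
        sd = statistics.stdev(openings_in_cell)
        return [i for i, x in enumerate(openings_in_cell) if x > mean + sd]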

The resulting weight for the cell is then attached to each occupational record in the cell and multiplied against the number of full-time and part-time openings to derive estimates of full-time and part-time vacancies, the primary measures for analysis. Each record in each cell then has a weight and estimate corresponding to that cell, excluding outliers found using the Chi-square test. These estimates are summed to obtain higher level estimates. For example, full-time and part-time retail openings are the result of summing all weighted full-time and part-time openings across all employment size classes for retail.

In some cases establishments provide data on the number of openings but not on hourly wage or salary. All available wage data are converted to hourly wage rates using 2,040 work hours per year for annual and monthly salaries. The lower end of a range is used when salary depends on qualifications and experience. However, some employers enter "base salary plus commission," "commission" or "depends," or do not respond. In these cases separate weights are constructed for establishments that do provide wage data, using the same methodology as above but substituting the number of respondents with usable wage data for the total number responding; this adjusts for non-response on the wage variable by increasing the weight in each cell accordingly.
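
A minimal sketch of the wage conversion follows, assuming free-text wage entries; the parsing rules shown are simplified assumptions.

    # Simplified sketch of converting reported pay to an hourly rate using the
    # 2,040 annual work hours noted above; parsing rules are assumptions.
    def hourly_rate(wage_text):
        """Return an hourly wage, or None when the entry is unusable."""
        text = wage_text.strip().lower().replace("$", "").replace(",", "")
        if not text or text in ("commission", "depends", "base salary plus commission"):
            return None
        low = text.split("-")[0].strip()        # keep the lower end of any range
        try:
            value = float(low.split()[0])
        except (ValueError, IndexError):
            return None
        if "year" in text or "annual" in text:
            return value / 2040
        if "month" in text:
            return value * 12 / 2040
        return value                            # assume the entry is already hourly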

Similarly, some establishments do not provide usable data on the fringe benefit, difficult-to-fill or experience questions, and non-response to these questions needs to be factored into a separate weight for cells where responses to these variables are missing. The most common cases occur when large establishments report the number of vacancies and FTEs for occupations from their weekly posting sheets rather than completing all of the survey questions.

As a result of the missing data, three weights are provided for each full-time and part-time occupational listing, resulting in six possible weighted estimates.


References

Abraham, Katharine. "Structural/Frictional vs. Deficient Demand Unemployment: Some New Evidence," American Economic Review 73 (September 1983).

__________. "Help-Wanted Advertising, Job Vacancies, and Unemployment, Brookings Papers on Economic Activity (1987) 1.

Australian Bureau of Statistics. Survey of Labour Demand (SOLD) Feasibility Study Report. Canberra: Labour Branch, 1991.

Frumerman, Harry. Job Vacancy Statistics in the United States. Washington, D.C.: National Commission on Employment and Unemployment Statistics, May 1978.

Holzer, Harry J. Unemployment, Vacancies and Local Labor Markets. Kalamazoo, Mich.: W.E. Upjohn Institute for Employment Research, 1989.

__________. What Employers Want: Job Prospects for Less-Educated Workers. New York: Russell Sage Foundation, 1996.

Levitan, Sar and Frank Gallo. "Workforce Statistics: Do We Know What We Think We Know -- And What Should We Do?" U.S. Congress, Joint Economic Committee, December 1989.

Myers, John G. and Daniel Creamer. Measuring Job Vacancies: A Feasibility Study in the Rochester, New York Area. New York: The Conference Board, 1967.

National Bureau of Economic Research. The Measurement and Interpretation of Job Vacancies. New York: Columbia University Press, 1966.

National Commission on Employment and Unemployment Statistics. Counting the Labor Force. Washington, D.C.: U.S. Government Printing Office, 1979.

__________. Counting the Labor Force Appendix Volume I: Concepts and Data Needs. Washington, D.C.: U.S. Government Printing Office, 1979.

Plunkert, Lois. "Job Openings Pilot Program: Final Report." Washington, D.C.: U.S. Department of Labor, Bureau of Labor Statistics, Office of Employment Structure & Trends, March 1981.

Riemer, David Raphael. The Prisoners of Welfare: Liberating America's Poor from Unemployment and Low Wages. New York: Praeger Publishers, 1988.

U.S. Bureau of the Census. STF3, 1990. Washington, D.C.: author, May 1992.

__________. General Social and Economic Characteristics, Wisconsin: 1980 U.S. Census. August 1983.

U.S. Department of Labor. Employee Turnover and Job Openings Survey: Results of a Pilot Study on the Feasibility of Collecting Measures of Imbalances of Supply and Demand for Labor in an Establishment Survey. Washington, D.C.: Bureau of Labor Statistics, 1991.

Wisconsin Department of Industry, Labor and Human Relations. Wisconsin Experimental Job Vacancy Project: A Comparative Study of ES Job Openings and JOLTS Job Vacancies. Madison, Wis.: author, October 1974.


Background on the Employment and Training Institute, University of Wisconsin-Milwaukee

Summaries of Employment and Training Institute Job Openings Surveys
