Appendix A
Other Shipping Environments
While the U.S. shipping environment is basically the same as in other countries, there are some differences, both in packaging and transportation, which make comparison difficult. As noted in the introduction, domestic damage costs for packages in transit are just over U.S. $6 million. However, the total cost of transport damage in western Europe is U.S. $3 billion each year, suggesting that further improvements in protective packaging are necessary. Also, packaging consumption in Japan is nearly double that of North America and Europe, which leads to greater packaging concerns there (Packforsk, 1999). In the U.S., break-bulk centers, where packages are unloaded, further segregated, and reloaded, are more common than in Europe, promoting much more manual handling of packages. The average length of haul is only 150 miles in Europe, compared to 500 miles in the U.S. ("An Evaluation of Current Trends in Thought Concerning Carrier Packaging Regulations," J. Stone, in Distribution Packaging Technology, R. Fiedler, 1995).
Measurement of European Environment
Dr. Thomas Trost of Packforsk (Swedish Packaging Research Institute) is heading up an EU technical committee called "Source Reduction by European Testing Schedules" (SRETS). The objective of SRETS is twofold:
The SRETS technical committee hopes that developing these techniques for normal use in the packaging chain will result in packaging specifications that meet minimum cost and material requirements, are compatible with current environmental issues, and reflect an improved awareness of transport quality in accordance with EN 29 000. The committee believes this will prevent over-testing as well as over-packaging. It should also contribute to the reduction of source materials in packaging and, ultimately, in the packaging waste stream.
The project is divided into the seven parts listed below. More information about the current status of SRETS can be obtained from Thomas Trost at thomas.trost@packforsk.se or by visiting the Packforsk web site at http://www.packforsk.se.
Measurement of Asian Environment
The China Project is a study sponsored by several companies (Dell, Hewlett-Packard, IBM, Johnson & Johnson, and Kodak) and designed to measure the distribution and handling environments in China. A team from Hewlett-Packard performed an initial survey and observation of the distribution channels and shipping and handling methods, which will be combined with objective measurements of shock, vibration, and temperature. The team found the infrastructure somewhat lacking, although there are signs of tremendous growth and development. Damage rates are apparently quite high, and overpackaging seems to be a common remedy for the problem. There is a lack of basic material handling equipment, such as forklifts and loading docks, and little unitized or palletized load shipment. Most packages are shipped and handled individually, increasing the likelihood of tosses and severe drops from the top of stacks.
The distribution environment is much more labor-intensive than in the U.S., and the numerous manual handlings increase both the chance of accidental drops and the overall drop frequency. China has a wide range of delivery vehicles, from trucks to bicycles, and a variety of road types, many of which are in poor condition. Packages also often suffer climatic exposure on open trucks and other vehicles.
The China Project is scheduled to complete fieldwork by the end of 1999. Further information about its progress can be obtained from Dennis Young of DYA, Inc. at dyainc@dyainc.com or the project's web site at http://www.dyainc.com/china/.
Appendix B
MADE Time Line
Period | Activity |
Aug. 96 | Initial Meeting / Form Committees |
Sept. to Dec. 96 | Develop Draft Test Plan and Protocol / Evaluate and Calibrate Recorders |
Jan. to Feb. 97 | Obtain Group Agreement on Test Plan and Protocol |
Mar. to May 97 | Start Alpha Data Collection / Fine Tune Recorder Parameters |
Jun. to Aug. 97 | Start Overall Analysis of Data / Fine Tune Test Plan and Protocol |
Sept. 97 | Present Findings to Group / Decide Future Study (Beta) |
Feb. to Apr. 98 | Finalize Alpha / Pre-launch Beta |
Nov. 98 | Continue Beta Collection / Update Group |
Feb. to May 99 | Finalize Beta / Continue Data Analysis |
Jun. to Aug. 99 | Write Wrap-up Report |
Appendix C
Alpha Phase Report
MEASUREMENT AND ANALYSIS OF DISTRIBUTION ENVIRONMENTS
Paul Russell, HP Corporate Packaging Program
Jorge Marcondes, San Jose State University
INTRODUCTION
A joint task group on transport environment measurement has been formed among members of organizations interested in measuring distribution hazards and packaging professionals (International Safe Transit Association and Institute of Packaging Professionals). The task group is based on open participation, and its scope is to establish an open-access storage center for collected data. Its objectives are to support and encourage transport environment measurement and access, develop data collection guidelines, develop data format guidelines, allow multi-media data access, and interface with constituents.
The task group is interested in collecting several categories of information, such as dynamic (shock and vibration), atmospheric (temperature, humidity, and pressure), and other systematic or non-systematic hazards. Five committees have been established to implement the project: administration, test plan, equipment, statistical, and analysis. The project was divided into two phases: phase alpha, with the objective of developing and validating guidelines for data collection, and phase beta, to actually collect data to support the database.
WHY MEASURE THE DISTRIBUTION ENVIRONMENT
Existing information on distribution hazards is controversial in both nature and intensity. Even standards for performance testing of transport packaging differ significantly. In addition to constantly changing conditions (data obtained last year may not be valid for today's environment), there are now more reliable measuring devices. Current technology allows data recording, digital storage, and manipulation of large amounts of data much more easily than in the past.
COMMITTEE PARTICIPATION
Table 1. MADE committees
MADE Committee | Responsibilities |
Administration | logistics and process flow |
Test Plan | development of test plan |
Equipment | data recorder fixture and calibration, data transfer, and equipment availability |
Statistical | sample size, variables, confidence level |
Analysis | data transfer, validity checks, and reporting formats |
Forty-six companies, including packaging manufacturers, packaging users, transport companies, equipment manufacturers, testing laboratories, and consulting firms, were affiliated with the task group at its inception. Most companies participate in one or more committees. The responsibilities of each committee are shown in Table 1.
PHASE ALPHA
This phase was performed with the objective of establishing and validating guidelines for data collection and analysis. Eight round-trip shipments were measured for impacts and temperature using self-contained data recorders. Shipments covered distances across the USA, between companies on the west and east coasts. Three recording units were mounted on wood blocks (two units to measure shock and temperature, and one unit to measure only temperature). The total assembly weighed 20 lb. The recording units were hard mounted on the wood structure to simulate a typical shipment within the high-tech industry. Figure 1 shows the two recording units in place.
Figure 1. Recorders mounted on a wood structure for phase alpha.
The recording units were set to record impacts and temperature. The wood structure containing the recorders was cushioned with corner pads and placed in a corrugated fiberboard box to be sent via small parcel distribution systems. The recording units were set to record the 100 most severe events during the entire trip, and temperature every 30 minutes. Before shipment, the entire assembly was calibrated by performing free-fall tests from 12, 24, 36, and 48 inches onto the faces, edges, and corners of the package, and comparing the readings with the actual drops. When a unit reported an event with less than 95% accuracy, a correction factor was established to adjust the results obtained from the field. All three units tested showed satisfactory performance for phase alpha.
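As a rough illustration of how such a correction factor might be derived (the report does not describe the exact method, so the function, threshold handling, and example numbers below are assumptions), a per-unit factor can be computed by comparing the recorder's reported drop heights against the known calibration drop heights:

```python
# Hypothetical sketch: derive a per-unit correction factor from calibration drops.
# Function and variable names are illustrative, not from the original study.

ACCURACY_THRESHOLD = 0.95  # a unit within 95% of the actual drop needs no correction

def correction_factor(actual_heights_in, reported_heights_in):
    """Return a multiplicative correction factor for a recorder, or 1.0 if the
    unit already reports within the accuracy threshold on every calibration drop."""
    ratios = [rep / act for act, rep in zip(actual_heights_in, reported_heights_in)]
    if all(abs(r - 1.0) <= (1.0 - ACCURACY_THRESHOLD) for r in ratios):
        return 1.0  # unit is accurate enough; use field data as-is
    # Otherwise scale field readings by the inverse of the mean reported/actual ratio
    return len(ratios) / sum(ratios)

# Example: calibration drops from 12, 24, 36 and 48 inches
actual = [12, 24, 36, 48]
reported = [11.0, 22.5, 33.0, 44.5]          # what the recorder claimed (hypothetical)
k = correction_factor(actual, reported)
corrected_field_drop = k * 30.0              # adjust a field reading of 30 in
print(f"correction factor = {k:.3f}, corrected drop = {corrected_field_drop:.1f} in")
```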
Currently, the task group is analyzing the results of phase alpha and refining the process for data collection, analysis and reporting. Phase beta will initially include shipments within the USA.
RESULTS OF PHASE ALPHA
a. Trip details
Phase alpha consisted of eight round trips, with the following trip codes and other details, as per test plan guidelines (Table 2):
Table 2. Phase alpha details
Trip Code | Sender | Receiver | Carrier (out/return) | Notes |
01051F10 | HP San Diego | HP Andover | FedEx/FedEx | (a) |
01051U11 | HP San Diego | HP Andover | UPS/UPS | (b) |
01051F12 | HP San Diego | HP Andover | FedEx/FedEx | (c) |
01051F13 | HP San Diego | HP Andover | FedEx/FedEx | (d) |
01051U14 | HP San Diego | HP Andover | UPS/UPS | (d) |
HP061F10 | HP Boise | IBM-RTP, NC | FedEx/Airborne | (d) |
HP061F11 | HP Boise | IBM-RTP, NC | FedEx/Airborne | (e) |
HP061F12 | HP Boise | IBM-RTP, NC | FedEx/Airborne | (f) |
(a) At HP-San Diego, someone along the way put "fragile" stickers on all 4 sides of the box.
(b) Data from first half of trip (San Diego to Andover) is invalid because the unit was not properly packed.
(c) Was mistakenly returned via UPS Ground on the return trip.
(d) FedEx service was unavailable from IBM Research Triangle Park, NC, so Airborne Express was used instead.
(e) Error reading acceleration data from EDR3 - only temperature data was recorded on disk and sent for analysis
(f) One week delay in return due to UPS strike
b. How the analysis was made
At the end of each trip, raw data files were sent to SJSU for analysis. The files were analyzed using the software provided by the data recorder manufacturers (Lansmont and IST), along with the recommended software setup. After the initial analysis of all data files at SJSU, some differences in the results were noticed. Lansmont and IST agreed to analyze the files separately, so the respective raw data files, along with details of the package used, were sent to them.
Two files (one from the EDR3 and one from the SAVER) presented errors in reading or processing, and their data could not be analyzed:
HP061F10 for SAVER - at the end of processing, the software returned a "General Error" fault and then reported that there were no events in the file.
HP061F12 for EDR3 - when uploading data from the unit to the computer, an error occurred in reading the acceleration data, and only temperature data was recorded.
c. "e values used for analysis
Before the units were sent out for data acquisition, the package system was calibrated at SJSU (as per the test plan). The raw data files were also sent to IST and Lansmont separately. The following are the "e" values they found in the calibration files and used in the analysis of the data. For the SAVER, the values are given in Table 3. For the EDR3, the values of "e" were determined as 0.375, 0.47, 0.43, and 0.52, with an average of 0.44.
Table 3. SAVER "e" (coefficient of restitution) values
Drop height (inches) | Orientation | e |
12 | Flat | 0.43 |
12 | Edge | 0.43 |
12 | Corner | 0.35 |
24 | Flat | 0.50 |
24 | Edge | 0.48 |
24 | Corner | 0.41 |
36 | Flat | 0.53 |
36 | Edge | 0.50 |
36 | Corner | 0.43 |
48 | Flat | 0.57 |
48 | Edge | 0.50 |
48 | Corner | 0.45 |
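These coefficients feed the equivalent drop height calculation. The sketch below assumes the conventional bounce model, in which the recorded velocity change of an impact equals (1 + e) times the impact velocity; it is an illustration of how "e" enters the analysis, not the vendors' actual software:

```python
# Sketch of the conventional "equivalent drop height" calculation, assuming the
# recorder software uses the usual bounce model: delta-v = (1 + e) * sqrt(2 * g * h).

G_IN_PER_S2 = 386.1  # gravitational acceleration in in/s^2

def equivalent_drop_height(delta_v_in_per_s, e):
    """Equivalent free-fall drop height (inches) from the velocity change of an
    impact event, given the coefficient of restitution e of the package system."""
    impact_velocity = delta_v_in_per_s / (1.0 + e)      # remove the rebound portion
    return impact_velocity ** 2 / (2.0 * G_IN_PER_S2)   # h = v^2 / (2g)

# Example: a 190 in/s velocity change on a corner impact with e = 0.41 (24 in row)
print(f"{equivalent_drop_height(190.0, 0.41):.1f} in")
```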
d. How the shock comparison tables were constructed and what they show
Initially, for each pair of files to be compared, the events that had an equivalent drop height above 10 inches were selected and listed for each unit. The events' date and time were used as the basis for comparison.
The clocks disagreed slightly (the EDR clock was slightly faster than the SAVER clock, for the recording units used in this phase: two SAVERs and two EDR3s). No study was done to determine which one was more accurate. This issue was important for phase alpha because the clocks were the basis for event comparison, but it should not be a problem for phase beta.
Most events observed were not free falls; they were impacts in essentially all orientations. The highest equivalent drop height found in all six round trips was 38.8 inches.
There was still some disagreement between the results given by the units. Most of these differences were found to be caused by the analysis being performed on different parts of the same event. This occurred because the recorded "window" did not always agree completely between units (although the units were triggered by the same event, the recorded waveforms started and ended at slightly different times, sometimes missing important parts of an event). This issue will be addressed in phase beta by extending the recording window well beyond the triggering time and by minimizing the dead time (the time just after one event, when the unit does not record the waveform, or "event storm").
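A minimal sketch of the timestamp-based event pairing described above, assuming a constant clock offset between units and a fixed matching tolerance (the offset, tolerance, function name, and example timestamps are all illustrative):

```python
# Illustrative sketch: pair events from two recorders by timestamp, allowing for a
# constant clock offset and a matching tolerance. Not the task group's actual tooling.
from datetime import datetime, timedelta

def match_events(edr_times, saver_times, clock_offset_s=0.0, tolerance_s=5.0):
    """Return (edr_time, saver_time) pairs whose timestamps agree within
    tolerance_s seconds after shifting the EDR clock by clock_offset_s."""
    pairs, used = [], set()
    for t_edr in edr_times:
        shifted = t_edr - timedelta(seconds=clock_offset_s)
        for i, t_saver in enumerate(saver_times):
            if i not in used and abs((shifted - t_saver).total_seconds()) <= tolerance_s:
                pairs.append((t_edr, t_saver))
                used.add(i)
                break
    return pairs

edr = [datetime(1997, 4, 2, 14, 30, 12), datetime(1997, 4, 2, 16, 5, 40)]
saver = [datetime(1997, 4, 2, 14, 30, 9), datetime(1997, 4, 2, 16, 5, 38)]
print(match_events(edr, saver, clock_offset_s=2.0, tolerance_s=5.0))
```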
e. Temperature results
There were no significant differences among the temperatures recorded across the trips. Temperature values ranged between 70°F and 95°F.
PHASE BETA
a. How data is to be presented
If a large number of replications of the same trip, package size, etc. will be conducted, then it may be better to present results in a statistical distribution format. If conditions change from trip to trip, it is more appropriate to present actual results (for example, events higher than 10 inches, the 5 largest drops, etc.). The user can then determine which statistical analysis is most appropriate for the application.
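The two presentation options might look like the following sketch, where the percentile summary suits many replications of the same trip and the event listing suits trips whose conditions vary (the thresholds, percentiles, and function names are illustrative assumptions):

```python
# Sketch of the two reporting options described above: a distribution summary for
# replicated trips, or a list of the largest events for one-off trips.

def distribution_summary(drop_heights_in, percentiles=(50, 90, 95, 99)):
    """Percentile summary of equivalent drop heights, suitable when many
    replications of the same trip and package are available."""
    data = sorted(drop_heights_in)
    def pct(p):
        idx = min(len(data) - 1, int(round(p / 100.0 * (len(data) - 1))))
        return data[idx]
    return {p: pct(p) for p in percentiles}

def largest_events(drop_heights_in, threshold_in=10.0, top_n=5):
    """Report events above the threshold and the top_n largest drops,
    suitable when conditions change from trip to trip."""
    above = [h for h in drop_heights_in if h > threshold_in]
    return {"above_threshold": sorted(above, reverse=True),
            "largest": sorted(drop_heights_in, reverse=True)[:top_n]}

drops = [3.2, 5.0, 7.8, 11.4, 12.9, 18.3, 22.0, 38.8]  # hypothetical trip data
print(distribution_summary(drops))
print(largest_events(drops))
```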
b. Package and trip documentation form:
Minor changes are required: 1) change "shipper" to "sender"; 2) include the date and time of turnaround (or make two forms, one for each leg of the trip).
c. Recording unit orientation:
Units should be positioned in the package according to their designated orientation, no matter what orientation the axes are (the software is set to report impact direction as per the unit's designated orientation). This was an issue in phase alpha because the units were side by side, but it should not be a problem for phase beta.
d. Database
Results of phase beta will be added to the database as they are collected and analyzed. After a certain amount of data has been collected, several statistical analyses will be performed on the data to better understand its distribution and the influence of controllable variables such as package size and weight.
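One such analysis might group records by a controllable variable, such as a weight class, and report a high percentile of equivalent drop height per group. The record fields, grouping, and example data below are hypothetical, since the database schema has not yet been defined:

```python
# Minimal sketch, assuming hypothetical database records with package weight and
# equivalent drop height fields; field names and classes are illustrative only.
from collections import defaultdict

def percentile_by_group(records, key, p=95):
    """Group records by a controllable variable (e.g. weight class) and report
    the p-th percentile equivalent drop height within each group."""
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r["drop_in"])
    out = {}
    for g, drops in groups.items():
        drops.sort()
        out[g] = drops[min(len(drops) - 1, int(round(p / 100.0 * (len(drops) - 1))))]
    return out

records = [
    {"weight_lb": 8,  "drop_in": 24.0}, {"weight_lb": 8,  "drop_in": 31.0},
    {"weight_lb": 45, "drop_in": 12.0}, {"weight_lb": 45, "drop_in": 16.5},
]
weight_class = lambda r: "under 20 lb" if r["weight_lb"] < 20 else "20 lb and over"
print(percentile_by_group(records, weight_class))
```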
The goal of the task force is to provide a web site where users would log on and request information by entering some basic information about their products and packages, such as:
and information about the following parameters would be retrieved from the database:
CONCLUSIONS
The measurement and analysis of distribution environments aims to provide packaging designers with accurate and up-to-date information about distribution hazards, such as drops, impacts, humidity, and temperature. The task group is a joint effort between ISTA and IoPP and has open participation. Companies willing to join the task group can do so by contacting the authors. Members can participate in the study by contributing engineering time, loaning data recorders, and sharing information.
Appendix D - Test Plan for Beta
Appendix E
MADE Beta Results (betaresults.xls, 1032 KB)
This section is related to the MADE Beta study and consists of four parts:
Appendix F
The MADE team performed the following study to determine the cause of discrepancies found in the alpha phase results.
Appendix G
Ergonomics Formula
Maximum weight vs. reach for infrequent lifts
BASED ON THE FORMULA:
AL (lbs.) BASED ON VARYING H WITH V, D, F, and FMAX HELD CONSTANT
H (HORIZONTAL DISTANCE) in inches
V (VERTICAL DISTANCE) in inches
D (VERTICAL CHANGE) in inches
F (FREQUENCY) in lifts/minute
Fmax (GIVEN) in lifts/minute observed
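Assuming the formula referred to here is the 1981 NIOSH Action Limit (AL) expressed in U.S. units, which matches the variable list above (this is an assumption, not confirmed by the source), it takes the form:

```latex
% Assumed form: 1981 NIOSH Action Limit, AL in pounds, H, V, D in inches
AL \;=\; 90\left(\frac{6}{H}\right)
         \left(1 - 0.01\,\lvert V - 30\rvert\right)
         \left(0.7 + \frac{3}{D}\right)
         \left(1 - \frac{F}{F_{\max}}\right)
```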
Source: New Approaches to Defining the Distribution Environment, J. Daniels and R. Sanders
Appendix H
References in Industry
Alfred H. McKinlay, 432 McDougall Road, Pattersonville, NY 12137. Phone: (518) 887-5656. Email: almckinlay@aol.com
S. Paul Singh, Michigan State University, School of Packaging, East Lansing, MI 48824. Phone: (517) 355-7614. Fax: (517) 353-8999. Email: singh@pilot.msu.edu
Stephen R. Pierce, Eastman Kodak Co., PEGD/153, 1/205, Rochester, NY 14650. Phone: (716) 477-4483. Fax: (716) 588-6985.
Wayne Tustin, Equipment Reliability Institute, 1520 Santa Rosa Avenue, Santa Barbara, CA 93109. Phone: (805) 564-1260. Fax: (805) 564-1260. Email: tustin@equipment-reliabilty.com
Herb Schueneman, Westpak, Inc., 134 Martinvale Lane, San Jose, CA 95119. Phone: (408) 224-1300. Fax: (408) 224-5113. Email: herbwp@aol.com
Dennis Young, Dennis Young & Associates, Inc., 240 Front Avenue SW, Suite 219, Grand Rapids, MI 49504. Phone: (616) 459-9700. Fax: (616) 459-5011. Email: dyainc@dyainc.com