In this post, we present a Google Chrome AI-generated summary of what a semi-repetitive process is, together with related major references (some minor editing applied, as deemed necessary):
What is a semi-repetitive process?
A semi-repetitive process is a production or operational cycle that lies between repetitive (constant work content) and non-repetitive (memoryless) processes. It has a characteristic, recognizable structure (mode) but includes random, partial variations in work content between cycles. Such processes are partially predictable, with variations often caused by changes in “process identity”.
Key Usage Examples
- Medical Procedures: Open-heart surgeries are semi-repetitive; while they have a common core, the number of bypasses varies between patients.
- Manufacturing: Assembly lines that are easily stopped, started, or adjusted for small batches, often termed semi-continuous.
- Business Operations: Administrative tasks like order entry, bank reconciliation, and handling customer calls in a call center.
- Industrial Processes: Semi-continuous pyrolysis used for bio-oil production, which improves upon batch inconsistencies.
- Robotics: Robot pick-and-place tasks where the robot repeats a cycle (collecting a payload, transferring it, returning) until a shutdown command.
Characteristics
- Variable Duration: A “typical” time (the mode) exists, but it is not necessarily equal to the average time (the mean).
- Predictability: Unlike fully random processes, semi-repetitive processes allow for estimations of time and resources, though these become less reliable as variability increases.
- Process Repetitiveness Measure (PRM): A metric quantifying the degree to which a process is repetitive, based on the standardized distance between the mode and the mean; it typically correlates negatively with the Coefficient of Variation (CV).
Major References
Haim Shore, a prominent researcher in quality and reliability engineering, is the primary source for the modern academic definition and modeling of semi-repetitive processes. His work establishes a framework for analyzing processes that fall between strictly repetitive and completely random (memoryless) states. A key academic reference to his work is the four-part series (published February 2026):
Part 1: Engineering Implications of Semi-Repetitive Processes
Part 2: The Dual-Component Variation of Semi-Repetitive Processes
Part 3: Measuring Repetitiveness of Semi-Repetitive Processes
Part 4: Reliability of Process-Time Prediction for Semi-Repetitive Processes
Other Key Academic References
The following works also provide core definitions, mathematical modeling, and engineering implications:
- Shore, H. (2024): “A novel approach to modeling steady-state process-time with smooth transition from repetitive to semi-repetitive to non-repetitive (memoryless) processes” — Published in Quality and Reliability Engineering International, this is the most comprehensive recent reference. It introduces the Random-identity Paradigm, viewing work-content variation as a change in “process identity”.
- Shore, H. (2021): “SPC scheme to monitor surgery duration” — Published in Quality and Reliability Engineering International, this work applies semi-repetitive modeling to medical procedures, a classic example of the concept.
- Shore, H. (2021): “Estimating operating room utilisation rate for differently distributed surgery times” — Found in the International Journal of Production Research, it examines how semi-repetitive variability affects resource planning.
- Shore, H. (2020): “An explanatory bi-variate model for surgery-duration and its empirical validation” — Published in Communications in Statistics, this provides the empirical foundation for treating surgery as a semi-repetitive process.
Foundational & Supporting Works
While Shore formalized the “semi-repetitive” term, these references provide the broader context of process variability and manufacturing:
- Manufacturing Classification: The distinction between repetitive and intermittent (job-shop) systems is traditionally referenced back to De Toni & Panizzolo (1995) in their comparative study of manufacturing characteristics.
- Statistical Theory: For the underlying distribution theory (such as why the mode departs from the mean in these processes), Shore cites Stuart & Ord (1987), Kendall’s Advanced Theory of Statistics.
- Automation Context: Modern references like Hosseini et al. (2015) discuss surgical duration estimation using data mining, which aligns with the need to predict semi-repetitive process outcomes.
Quantifying how “repetitive” a process is – the Process Repetitiveness Measure (PRM)
PRM is a statistical metric developed by Haim Shore to quantify how closely a process adheres to a “typical” repeatable cycle. It serves as a bridge between strictly repetitive systems (where work content is constant) and non-repetitive ones (where there is no discernible typical state).
Conceptual Basis: Mode-Mean Departure
The fundamental logic of PRM is that in a repetitive process, the distribution of completion times is highly concentrated around a single value (the mode). As a process becomes semi-repetitive or non-repetitive, the “typical” time (mode) drifts further away from the average time (mean) due to increasing random variations in work content. Refer to “Why the mode departs from the mean ‒ A short communication”.
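As a concrete illustration of the mode-mean departure, consider process times modeled by a Gamma distribution (commonly used for surgery durations). The shape parameter `k` and scale `theta` in this sketch are assumptions chosen for illustration; the closed-form moments, however, are standard Gamma properties:

```python
import math

def gamma_mode_mean_departure(shape_k: float, scale_theta: float):
    """For a Gamma(k, theta) process-time model with k >= 1:
    mean = k*theta, mode = (k-1)*theta, sd = sqrt(k)*theta.
    The standardized departure (mean - mode)/sd reduces to 1/sqrt(k),
    which equals the CV for the Gamma distribution."""
    mean = shape_k * scale_theta
    mode = (shape_k - 1.0) * scale_theta
    sd = math.sqrt(shape_k) * scale_theta
    departure = (mean - mode) / sd
    cv = sd / mean
    return mean, mode, departure, cv

# As shape k grows (the process becomes more repetitive),
# the mode converges toward the mean and CV shrinks:
for k in (1.5, 4.0, 25.0):
    mean, mode, dep, cv = gamma_mode_mean_departure(k, 10.0)
    print(f"k={k}: mean={mean:.1f}, mode={mode:.1f}, "
          f"(mean-mode)/sd={dep:.3f}, CV={cv:.3f}")
```

Note that for the Gamma model the standardized mode-mean distance and the CV coincide exactly, consistent with the equivalence noted under the metric definition below.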
Metric Definition
PRM is defined as the standardized distance between the mode and the mean of the process-time distribution.
Mathematical Components
It is typically expressed in terms of the first four statistical moments (mean, variance, skewness, and kurtosis). Shore’s research demonstrates that for many practical applications, the Coefficient of Variation (CV)—the ratio of the standard deviation to the mean—serves as a reliable and simpler proxy: PRM and 1−CV are often positively and linearly correlated, and for specific distributions, like the Gamma distribution (commonly used for surgery times), they are mathematically equivalent. Using 1−CV is often preferred because estimating the third and fourth moments, required for a full PRM calculation, involves larger standard errors and demands significantly more data.
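The 1−CV proxy described above can be estimated directly from a sample of observed cycle times. This is a minimal sketch of the proxy only, not Shore’s full moment-based PRM formula; the sample data are invented for illustration:

```python
import statistics

def repetitiveness_proxy(times):
    """Sample-based proxy for process repetitiveness: 1 - CV,
    where CV = sample standard deviation / sample mean.
    Values near 1 suggest a repetitive process; values near
    zero (or negative) suggest a non-repetitive one."""
    mean = statistics.fmean(times)
    sd = statistics.stdev(times)
    return 1.0 - sd / mean

# Tightly clustered cycle times (repetitive-like): proxy near 1
print(repetitiveness_proxy([9.8, 10.1, 10.0, 9.9, 10.2]))

# Highly variable cycle times (non-repetitive-like, CV around 1):
# proxy near zero or below
print(repetitiveness_proxy([1.0, 3.0, 12.0, 0.5, 25.0, 7.0]))
```

Only the first two moments are needed, which is precisely why this proxy is more robust on small samples than a full four-moment PRM estimate.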
Application in Predictability
The PRM is used to determine the predictability threshold of a process.
- Predictable: A high PRM (low CV) indicates a strong “process identity”, so completion times can be forecasted accurately (allowing for random error variation).
- Unpredictable: When PRM drops below a certain statistical criterion, the process ceases to be predictable, and management can no longer rely on time estimates for scheduling (e.g., in operating-room utilization).
Summary Table of Process Types
| Process Type | Work Content | PRM / CV Level | Predictability |
|---|---|---|---|
| Repetitive | Constant | Very high PRM / low CV | Certain (apart from error) |
| Semi-Repetitive | Partially variable | Moderate PRM / CV | Statistical / probabilistic |
| Non-Repetitive | Memoryless (no typical work content) | Low PRM / high CV | Impossible |
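The three-way classification in the table above can be sketched as a simple rule on the estimated CV. The cut-off values used here (0.2 and 0.8) are arbitrary examples for illustration, not thresholds taken from Shore’s papers:

```python
def classify_process(cv: float) -> str:
    """Classify a process by its coefficient of variation (CV).
    The thresholds 0.2 and 0.8 are illustrative assumptions only."""
    if cv < 0.2:
        return "Repetitive"       # work content nearly constant
    if cv < 0.8:
        return "Semi-Repetitive"  # recognizable core, partial variation
    return "Non-Repetitive"       # no typical work content (memoryless)

print(classify_process(0.05))  # e.g., a fixed assembly cycle
print(classify_process(0.45))  # e.g., surgery durations
print(classify_process(1.00))  # e.g., exponential (memoryless) times
```

In practice the decision criterion would come from the statistical test Shore proposes, rather than from fixed cut-offs like these.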