David007
Active 6 months, 2 weeks ago

I’m familiar with the paper and, for the first time, agree with you, Fausto. There are real concerns about the misuse of p-values, but that doesn’t mean we should stop using them. 12 months ago

R is free, but it isn’t cheap: you will need to spend significant time learning it. 1 year, 1 month ago

Here is a good blog post/video that explains the d2 and d3 control-chart constants using simulation:
https://andrewmilivojevich.com/d2constantd3constant/
Disclaimer: I’m not affiliated with the blogger or Minitab
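For anyone who wants to try it themselves, here is a minimal sketch (my own illustrative Python, not code from the linked post) that estimates d2 and d3 by Monte Carlo:

```python
import numpy as np

# Monte Carlo estimate of the d2 and d3 control-chart constants:
# d2 = E[R]/sigma and d3 = SD[R]/sigma, where R is the range of n
# standard normal observations. Sample size n and rep count are my choices.
rng = np.random.default_rng(1)
n, reps = 5, 200_000
samples = rng.standard_normal((reps, n))
ranges = samples.max(axis=1) - samples.min(axis=1)
d2_hat = ranges.mean()
d3_hat = ranges.std(ddof=1)
print(f"n={n}: d2 ~ {d2_hat:.3f} (table: 2.326), d3 ~ {d3_hat:.3f} (table: 0.864)")
```

The estimates should land very close to the published table values for n=5.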

Thanks Robert. Good assessment. 1 year, 9 months ago

Robert, it’s funny you should mention extreme religious movements because that’s exactly what I thought he was doing. 1 year, 9 months ago

Look, if you said something like parameter estimation issues due to small sample size, or the issues with transformation (debated extensively in the past – Wheeler versus Breyfogle), we could have a civil conversation/debate. But you are just trying to sell your book. 1 year, 9 months ago

The burden of proof is on the one making the claim. I was being satirical because your Six Sigma Hoax article was published in Nuclear Science by Science Publishing Group, exposed by Beall as a predatory publisher. Citing everything you have ever published is obvious spamming. It’s also pretty obvious that you’re just trying to sell your…[Read more]

Well, you can always publish in the prestigious journal Nuclear Science, along with your spam practice of citing every paper that you have ever published.

For the rounded-data problem, an omnibus skewness-kurtosis test such as Doornik-Hansen (or Jarque-Bera) is useful. 1 year, 9 months ago
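A quick illustrative sketch (my own example data; SciPy has no built-in Doornik-Hansen test, so Jarque-Bera stands in):

```python
import numpy as np
from scipy import stats

# Jarque-Bera is an omnibus test built on sample skewness and kurtosis,
# so ties introduced by rounding matter less than they do for
# distance-based tests. Data below are simulated normal values
# rounded to one decimal place (my choice of parameters).
rng = np.random.default_rng(7)
x = rng.normal(loc=10.0, scale=0.5, size=500).round(1)

jb_stat, jb_p = stats.jarque_bera(x)
print(f"Jarque-Bera: statistic = {jb_stat:.2f}, p = {jb_p:.3f}")
```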

Fausto runs an IMR chart on exponentially distributed data and declares the process is out of control. He must love those type I errors.
Fausto’s students search in vain for an assignable cause, because there is none to be found: the process is inherently exponential.
Fausto shouts from the rooftop “Minitab is wrong” “Six Sigma is a hoa…[Read more]
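The type I error point is easy to demonstrate. A minimal sketch (illustrative Python, my own parameter choices) of an individuals chart applied to perfectly stable exponential data:

```python
import numpy as np

# An I-MR chart's 3-sigma limits assume approximate normality. Applied to
# inherently exponential data, a stable process still alarms far more
# often than the nominal 0.27% per point.
rng = np.random.default_rng(42)
reps, n = 5_000, 100
alarms, total = 0, 0
for _ in range(reps):
    x = rng.exponential(scale=1.0, size=n)   # stable exponential process
    mr = np.abs(np.diff(x))                  # moving ranges
    sigma_hat = mr.mean() / 1.128            # d2 constant for n=2
    ucl = x.mean() + 3 * sigma_hat
    lcl = x.mean() - 3 * sigma_hat
    alarms += int(np.sum((x > ucl) | (x < lcl)))
    total += n
rate = alarms / total
print(f"false-alarm rate per point: {rate:.3%} (nominal: 0.27%)")
```

With these settings the per-point false-alarm rate comes out around ten times the nominal level, every alarm a type I error.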

I look forward to seeing your paper in QE. 1 year, 10 months ago

As others have said, I anxiously await your rebuttal paper in any of Quality Engineering, Journal of Quality Technology, Technometrics, Journal of Applied Statistics, etc. 1 year, 10 months ago

Minitab tech support is not needed. Estimate the scale (exponential) or scale and shape (Weibull) by maximum likelihood, then compute the 0.135th and 99.865th percentiles (p = 0.00135 and 0.99865). 1 year, 10 months ago
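A minimal sketch of that recipe (illustrative Python on simulated data; the SciPy calls are my choice, not Minitab’s internals):

```python
import numpy as np
from scipy import stats

# Fit by maximum likelihood, then take the 0.00135 and 0.99865 quantiles
# as control limits. Data are simulated exponential with scale 2.
rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=200)

# Exponential: with floc=0 the MLE of the scale is just the sample mean.
loc, scale = stats.expon.fit(x, floc=0)
lcl, ucl = stats.expon.ppf([0.00135, 0.99865], loc=loc, scale=scale)
print(f"exponential limits: [{lcl:.4f}, {ucl:.3f}]")

# Weibull: MLE of shape c and scale (location fixed at 0).
c, loc_w, scale_w = stats.weibull_min.fit(x, floc=0)
lcl_w, ucl_w = stats.weibull_min.ppf([0.00135, 0.99865], c,
                                     loc=loc_w, scale=scale_w)
print(f"Weibull limits:     [{lcl_w:.4f}, {ucl_w:.3f}]")
```

Since the true process here is exponential(2), the upper limit should be near 2 × (−ln 0.00135) ≈ 13.2, and the fitted Weibull shape should be near 1.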

Post a white paper to prove your point. 1 year, 10 months ago

> With millions of simulations one can find that the MINITAB T CHARTS are WRONG 93.3% of “applications”
Sure, if the underlying distribution is something other than an exponential (or Weibull) distribution, say an application with a lognormal or loglogistic distribution. That’s why we do distribution identification with probability plo…[Read more]
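A minimal sketch of distribution identification via probability-plot correlation (my own illustrative example, not Minitab’s exact procedure):

```python
import numpy as np
from scipy import stats

# Compare candidate distributions on the same data by how straight their
# probability plots are (correlation coefficient r closer to 1 = better
# fit). Here the data are lognormal, so the lognormal fit should win.
rng = np.random.default_rng(5)
x = rng.lognormal(mean=0.0, sigma=0.7, size=200)

s, _, _ = stats.lognorm.fit(x, floc=0)       # MLE of the lognormal shape
res_logn = stats.probplot(x, dist=stats.lognorm, sparams=(s,))
res_expo = stats.probplot(x, dist=stats.expon)
r_logn = res_logn[1][2]
r_expo = res_expo[1][2]
print(f"probability-plot r: lognormal {r_logn:.4f}, exponential {r_expo:.4f}")
```

Picking the wrong family here is exactly how one manufactures "wrong" control limits, which is the point about the T chart above.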

If any theory from process expertise is available then, of course, use it. In the examples discussed we were given that it was an exponential distribution, so control limits based on an exponential distribution should be used.
Generally, however, with n=20, any distribution identification is going to be an approximation. So the model is…[Read more]

FYI, we always tell people that if a process is nonnormal due to outliers, then deal with those; don’t use transformations or a nonnormal distribution.
However, the processes being discussed here are inherently nonnormal, with an exponential distribution.
Effectively, by defining a process like this as “out of control” you are sending your clien…[Read more]

You really should play with some simulated exponential data. You’ll be surprised by what you see as “inherent variation”. 1 year, 10 months ago
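For instance, a few lines of illustrative Python (my own sketch):

```python
import numpy as np

# Even a perfectly stable exponential process routinely produces points
# several times its mean, which look like "outliers" to a normal-trained
# eye but are just inherent variation.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=50)
print(f"mean = {x.mean():.2f}, max = {x.max():.2f}, "
      f"max/mean = {x.max() / x.mean():.1f}")
```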

According to the distribution-ID method, we should be using a 2-parameter exponential or 3-parameter Weibull anyway. But as I’ve repeatedly said, with n=20 any fitted model is going to be wrong. 1 year, 10 months ago