Table 3 TRIPOD adherence of included studies

From: An updated systematic review of radiomics in osteosarcoma: utilizing CLAIM to adapt the increasing trend of deep learning application in radiomics

| 37 Selected items in 22 criteria according to 7 sections (N = 29) | Study, n (%) |
| --- | --- |
| Overall (excluding items 5c, 11, 14b, 10c, 10e, 12, 13c, 17, and 19a) | 481/812 (59.2) |
| Section 1: Title and Abstract | 18/58 (31.0) |
| 1. Title—identify developing/validating a model, target population, and the outcome | 2/29 (6.9) |
| 2. Abstract—provide a summary of objectives, study design, setting, participants, sample size, predictors, outcome, statistical analysis, results, and conclusions | 16/29 (55.2) |
| Section 2: Introduction | 36/58 (62.1) |
| 3a. Background—explain the medical context and rationale for developing/validating the model | 29/29 (100.0) |
| 3b. Objective—specify the objectives, including whether the study describes the development/validation of the model or both | 7/29 (24.1) |
| Section 3: Methods | 218/377 (57.8) |
| 4a. Source of data—describe the study design or source of data (randomized trial, cohort, or registry data) | 29/29 (100.0) |
| 4b. Source of data—specify the key dates | 29/29 (100.0) |
| 5a. Participants—specify key elements of the study setting, including number and location of centers | 29/29 (100.0) |
| 5b. Participants—describe eligibility criteria for participants (inclusion and exclusion criteria) | 22/29 (75.9) |
| 5c. Participants—give details of treatment received, if relevant (N = 25) | 16/25 (64.0) |
| 6a. Outcome—clearly define the outcome, including how and when assessed | 27/29 (93.1) |
| 6b. Outcome—report any actions to blind assessment of the outcome | 3/29 (10.3) |
| 7a. Predictors—clearly define all predictors, including how and when assessed | 10/29 (34.5) |
| 7b. Predictors—report any actions to blind assessment of predictors for the outcome and other predictors | 4/29 (13.8) |
| 8. Sample size—explain how the study size was arrived at | 3/29 (10.3) |
| 9. Missing data—describe how missing data were handled, with details of any imputation method | 6/29 (20.7) |
| 10a. Statistical analysis methods—describe how predictors were handled | 29/29 (100.0) |
| 10b. Statistical analysis methods—specify type of model, all model-building procedures (any predictor selection), and method for internal validation | 21/29 (72.4) |
| 10d. Statistical analysis methods—specify all measures used to assess model performance and, if relevant, to compare multiple models (discrimination and calibration) | 6/29 (20.7) |
| 11. Risk groups—provide details on how risk groups were created, if done (N = 0) | n/a |
| Section 4: Results | 117/174 (67.2) |
| 13a. Participants—describe the flow of participants, including the number of participants with and without the outcome; a diagram may be helpful | 16/29 (55.2) |
| 13b. Participants—describe the characteristics of the participants, including the number of participants with missing data for predictors and outcome | 26/29 (89.7) |
| 14a. Model development—specify the number of participants and outcome events in each analysis | 23/29 (79.3) |
| 14b. Model development—report the unadjusted association between each candidate predictor and outcome, if done (N = 5) | 4/5 (80.0) |
| 15a. Model specification—present the full prediction model to allow predictions for individuals (regression coefficients, intercept) | 21/29 (72.4) |
| 15b. Model specification—explain how to use the prediction model (nomogram, calculator, etc.) | 11/29 (37.9) |
| 16. Model performance—report performance measures (with confidence intervals) for the prediction model | 20/29 (69.0) |
| Section 5: Discussion | 86/87 (98.9) |
| 18. Limitations—discuss any limitations of the study | 28/29 (96.6) |
| 19b. Interpretation—give an overall interpretation of the results | 29/29 (100.0) |
| 20. Implications—discuss the potential clinical use of the model and implications for future research | 29/29 (100.0) |
| Section 6: Other information | 6/58 (10.3) |
| 21. Supplementary information—provide information about the availability of supplementary resources, such as study protocol, Web calculator, and data sets | 0/29 (0.0) |
| 22. Funding—give the source of funding and the role of the funders for the present study | 6/29 (20.7) |
| Section 7: Validation for model types 2a, 2b, 3, and 4 (N = 16) | 32/64 (50.0) |
| 10c. Statistical analysis methods—describe how the predictions were calculated | 15/16 (93.8) |
| 10e. Statistical analysis methods—describe any model updating (recalibration), if done (N = 0) | n/a |
| 12. Development versus validation—identify any differences from the development data in setting, eligibility criteria, outcome, and predictors | 10/16 (62.5) |
| 13c. Participants (for validation)—show a comparison with the development data of the distribution of important variables | 2/16 (12.5) |
| 17. Model updating—report the results from any model updating, if done (N = 0) | n/a |
| 19a. Interpretation (for validation)—discuss the results with reference to performance in the development data and any other validation data | 5/16 (31.3) |

  1. A study was considered to have basic adherence to an item when it scored one point for that item. The adherence rate was calculated as the proportion of articles with basic adherence to the total number of articles. The "if done" or "if relevant" items (5c, 11, and 14b) and the validation items (10c, 10e, 12, 13c, 17, and 19a) were excluded from both the numerator and denominator
  2. TRIPOD Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis, n/a not applicable
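To make the calculation in footnote 1 concrete, below is a minimal Python sketch (not from the original study) that reproduces the overall adherence rate in the first row of the table from the per-item counts in the "Study, n (%)" column; the dictionary layout and function name are illustrative assumptions, while the counts themselves are copied from the table.

```python
# Minimal sketch of the adherence-rate calculation described in footnote 1.
# Per-item numerators are taken directly from Table 3; the "if done"/"if
# relevant" items (5c, 11, 14b) and validation items (10c, 10e, 12, 13c, 17,
# 19a) are excluded, leaving 28 of the 37 selected items.
adherent_counts = {
    "1": 2, "2": 16,
    "3a": 29, "3b": 7,
    "4a": 29, "4b": 29, "5a": 29, "5b": 22, "6a": 27, "6b": 3,
    "7a": 10, "7b": 4, "8": 3, "9": 6, "10a": 29, "10b": 21, "10d": 6,
    "13a": 16, "13b": 26, "14a": 23, "15a": 21, "15b": 11, "16": 20,
    "18": 28, "19b": 29, "20": 29,
    "21": 0, "22": 6,
}

N_STUDIES = 29  # number of included studies


def adherence_rate(adherent: int, total: int) -> str:
    """Format an adherence rate as 'adherent/total (percent)'."""
    return f"{adherent}/{total} ({100 * adherent / total:.1f})"


# Per-item adherence rate, e.g. item 2 -> "16/29 (55.2)"
print("Item 2:", adherence_rate(adherent_counts["2"], N_STUDIES))

# Overall adherence rate across the 28 included items -> "481/812 (59.2)",
# matching the "Overall" row of the table.
overall_numerator = sum(adherent_counts.values())
overall_denominator = len(adherent_counts) * N_STUDIES
print("Overall:", adherence_rate(overall_numerator, overall_denominator))
```

The section-level rows are computed the same way over that section's included items, e.g. Section 3 sums 13 items over 29 studies to give 218/377 (57.8).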