Multiple platform build/check report for BioC 3.21
This page was generated on 2025-04-22 13:18 -0400 (Tue, 22 Apr 2025).
Hostname | OS | Arch (*) | R version | Installed pkgs
---|---|---|---|---
nebbiolo1 | Linux (Ubuntu 24.04.1 LTS) | x86_64 | 4.5.0 RC (2025-04-04 r88126) -- "How About a Twenty-Six" | 4831
palomino7 | Windows Server 2022 Datacenter | x64 | 4.5.0 RC (2025-04-04 r88126 ucrt) -- "How About a Twenty-Six" | 4573
lconway | macOS 12.7.1 Monterey | x86_64 | 4.5.0 RC (2025-04-04 r88126) -- "How About a Twenty-Six" | 4599
kjohnson3 | macOS 13.7.1 Ventura | arm64 | 4.5.0 RC (2025-04-04 r88126) -- "How About a Twenty-Six" | 4553
kunpeng2 | Linux (openEuler 24.03 LTS) | aarch64 | R Under development (unstable) (2025-02-19 r87757) -- "Unsuffered Consequences" | 4570

Click on any hostname to see more info about the system (e.g. compilers).

(*) as reported by 'uname -p', except on Windows and Mac OS X.
Package 1538/2341: pathMED 1.0.0 (landing page)
Maintainer: Jordi Martorell-Marugán

Hostname | OS / Arch | INSTALL | BUILD | CHECK | BUILD BIN
---|---|---|---|---|---
nebbiolo1 | Linux (Ubuntu 24.04.1 LTS) / x86_64 | OK | OK | OK | 
palomino7 | Windows Server 2022 Datacenter / x64 | OK | OK | OK | OK
lconway | macOS 12.7.1 Monterey / x86_64 | OK | OK | ERROR | OK
kjohnson3 | macOS 13.7.1 Ventura / arm64 | OK | OK | ERROR | OK
kunpeng2 | Linux (openEuler 24.03 LTS) / aarch64 | OK | OK | OK | 
To the developers/maintainers of the pathMED package:

- Allow up to 24 hours (and sometimes 48 hours) for your latest push to git@git.bioconductor.org:packages/pathMED.git to be reflected on this report. See Troubleshooting Build Report for more information.
- Use the following Renviron settings to reproduce errors and warnings. See Renviron.bioc for more information.
- If 'R CMD check' started to fail recently on the Linux builder(s) over a missing dependency, add the missing dependency to 'Suggests:' in your DESCRIPTION file.
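The 'Suggests:' advice above amounts to declaring check-time-only packages in the DESCRIPTION file. A minimal sketch of the field (the field name and comma-separated format are standard; this particular package list is illustrative, although RUnit and BiocGenerics do appear in pathMED's test driver below):

```
Suggests:
    RUnit,
    BiocGenerics,
    knitr,
    rmarkdown
```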
Package: pathMED
Version: 1.0.0
Command: /Library/Frameworks/R.framework/Resources/bin/R CMD check --install=check:pathMED.install-out.txt --library=/Library/Frameworks/R.framework/Resources/library --no-vignettes --timings pathMED_1.0.0.tar.gz
StartedAt: 2025-04-21 20:49:46 -0400 (Mon, 21 Apr 2025)
EndedAt: 2025-04-21 20:51:41 -0400 (Mon, 21 Apr 2025)
ElapsedTime: 115.3 seconds
RetCode: 1
Status: ERROR
CheckDir: pathMED.Rcheck
Warnings: NA
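For reference, a minimal local reproduction of the check reported above, assuming an R 4.5.x installation on your PATH and the pathMED source directory in the current working directory (the builder additionally passes `--install=` and `--library=` flags specific to its setup):

```shell
# Build the source tarball, then run the same check the builder ran,
# skipping vignettes as the builder did (--no-vignettes).
R CMD build pathMED
R CMD check --no-vignettes --timings pathMED_1.0.0.tar.gz
```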
##############################################################################
##############################################################################
###
### Running command:
###
###   /Library/Frameworks/R.framework/Resources/bin/R CMD check --install=check:pathMED.install-out.txt --library=/Library/Frameworks/R.framework/Resources/library --no-vignettes --timings pathMED_1.0.0.tar.gz
###
##############################################################################
##############################################################################

* using log directory ‘/Users/biocbuild/bbs-3.21-bioc/meat/pathMED.Rcheck’
* using R version 4.5.0 RC (2025-04-04 r88126)
* using platform: aarch64-apple-darwin20
* R was compiled by
    Apple clang version 14.0.0 (clang-1400.0.29.202)
    GNU Fortran (GCC) 14.2.0
* running under: macOS Ventura 13.7.1
* using session charset: UTF-8
* using option ‘--no-vignettes’
* checking for file ‘pathMED/DESCRIPTION’ ... OK
* this is package ‘pathMED’ version ‘1.0.0’
* package encoding: UTF-8
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘pathMED’ can be installed ... OK
* checking installed package size ... INFO
  installed size is 5.8Mb
  sub-directories of 1Mb or more:
    data   4.8Mb
* checking package directory ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking code files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking contents of ‘data’ directory ... OK
* checking data for non-ASCII characters ... OK
* checking data for ASCII and uncompressed saves ... OK
* checking R/sysdata.rda ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
Examples with CPU (user + system) or elapsed time > 5s
                              user system elapsed
mScores_imputeFromReference 11.642  1.323  13.017
mScores_filterPaths         11.320  1.440  12.784
mScores_createReference     10.553  1.145  11.766
predictExternal              6.055  0.096   6.172
ann2term                     5.388  0.209   5.621
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ...
  Running ‘runTests.R’ ERROR
Running the tests in ‘tests/runTests.R’ failed.
Last 13 lines of output:
  pathMED RUnit Tests - 3 test functions, 0 errors, 1 failure
  FAILURE in test_getScores: Error in checkEquals(round(scoresExample[1, 1], 5), 43.8289) :
    Mean relative difference: 0.1730913

  Test files with failing tests
     test_getScores.R
       test_getScores

  Error in BiocGenerics:::testPackage("pathMED") :
    unit tests failed for package pathMED
  In addition: There were 50 or more warnings (use warnings() to see the first 50)
  Execution halted
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED
* checking PDF version of manual ... OK
* DONE

Status: 1 ERROR
See ‘/Users/biocbuild/bbs-3.21-bioc/meat/pathMED.Rcheck/00check.log’ for details.
pathMED.Rcheck/00install.out
##############################################################################
##############################################################################
###
### Running command:
###
###   /Library/Frameworks/R.framework/Resources/bin/R CMD INSTALL pathMED
###
##############################################################################
##############################################################################

* installing to library ‘/Library/Frameworks/R.framework/Versions/4.5-arm64/Resources/library’
* installing *source* package ‘pathMED’ ...
** this is package ‘pathMED’ version ‘1.0.0’
** using staged installation
** R
** data
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (pathMED)
pathMED.Rcheck/tests/runTests.Rout.fail
R version 4.5.0 RC (2025-04-04 r88126) -- "How About a Twenty-Six"
Copyright (C) 2025 The R Foundation for Statistical Computing
Platform: aarch64-apple-darwin20

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> BiocGenerics:::testPackage("pathMED")
Healthy samples supplied. Calculating M-Scores using healthy samples as reference for 30 samples
  |======================================================================| 100%
ℹ GSVA version 2.2.0
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating GSVA ranks
ℹ GSVA dense (classical) algorithm
ℹ Row-wise ECDF estimation with Gaussian kernels
ℹ Calculating GSVA column ranks
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating GSVA scores
✔ Calculations finished
ℹ GSVA version 2.2.0
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating ssGSEA scores for 5 gene sets
ℹ Calculating ranks
ℹ Calculating rank weights
ℹ Normalizing ssGSEA scores
✔ Calculations finished
ℹ GSVA version 2.2.0
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating PLAGE scores for 5 gene sets
ℹ Centering and scaling values
✔ Calculations finished
ℹ GSVA version 2.2.0
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating Z-scores for 5 gene sets
ℹ Centering and scaling values
✔ Calculations finished
Timing stopped at: 7.693 0.243 7.938
Error in checkEquals(round(scoresExample[1, 1], 5), 43.8289) :
  Mean relative difference: 0.1730913
In addition: Warning messages:
1: In checkGenes(upSet, rownames(rankData)) :
  1 genes missing: IL8
2: In checkGenes(upSet, rownames(rankData)) :
  21 genes missing: ITGA9, PTK2, TNC, PDGFRB, EGFR, ITGA5, LAMB1, COL3A1, LAMB2, KDR, ITGB3, COL5A2, COL5A1, COL4A2, COL4A1, FN1, ITGAV, MET, COL6A1, COL6A3, COL6A2
3: In checkGenes(upSet, rownames(rankData)) :
  8 genes missing: TNC, FN1, FBN1, COL4A5, COL4A2, COL4A1, LAMB1, LAMB2
4: In checkGenes(upSet, rownames(rankData)) :
  22 genes missing: CDH11, PDGFRB, VCAN, BGN, AEBP1, COL3A1, LUM, THBS2, NID2, NID1, PCOLCE, SPP1, LAMA4, FBN1, COL5A2, COL5A1, COL4A2, COL4A1, MXRA8, COL6A1, COL6A3, COL6A2
5: In checkGenes(upSet, rownames(rankData)) :
  35 genes missing: TNFSF13, VCAN, TFF3, AEBP1, OLFM4, COL3A1, LOX, TNFAIP2, HTRA1, LILRB2, PLA2G7, CHIT1, THBS2, NID2, NID1, CRISP3, CST3, CTSH, LAMA4, ENTPD1, FBN1, COL5A2, POSTN, COL5A1, COL4A2, GRN, ACHE, LRG1, CXCL1, SMPDL3A, ORM1, FGL2, ANGPTL2, TGFBI, COL6A2
ℹ GSVA version 2.2.0
! Some gene sets have size one. Consider setting minSize > 1
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating Z-scores for 408 gene sets
ℹ Centering and scaling values
✔ Calculations finished
Loading required namespace: randomForest
Loading required namespace: gam
Loading required namespace: xgboost
Loading required namespace: ada
Training models...
  |======================================================================| 100%
Done
Calculating performance metrics...
Done
Training final model with all samples...
Done
ℹ GSVA version 2.2.0
! Some gene sets have size one. Consider setting minSize > 1
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating Z-scores for 408 gene sets
ℹ Centering and scaling values
✔ Calculations finished
Positive class not provided, selected: 'Healthy_sample'
ℹ GSVA version 2.2.0
! Some gene sets have size one. Consider setting minSize > 1
ℹ Using a SnowParam parallel back-end with 1 workers
ℹ Calculating Z-scores for 427 gene sets
ℹ Centering and scaling values
✔ Calculations finished
Training models...
  |======================================================================| 100%
Done
Calculating performance metrics...
Done
Training final model with all samples...
Done

RUNIT TEST PROTOCOL -- Mon Apr 21 20:51:38 2025
***********************************************
Number of test functions: 3
Number of errors: 0
Number of failures: 1

1 Test Suite :
pathMED RUnit Tests - 3 test functions, 0 errors, 1 failure
FAILURE in test_getScores: Error in checkEquals(round(scoresExample[1, 1], 5), 43.8289) :
  Mean relative difference: 0.1730913

Test files with failing tests
   test_getScores.R
     test_getScores

Error in BiocGenerics:::testPackage("pathMED") :
  unit tests failed for package pathMED
In addition: There were 50 or more warnings (use warnings() to see the first 50)
Execution halted
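The failure above is an exact numeric comparison against a hard-coded constant, which is fragile when upstream scoring methods change (the log shows GSVA 2.2.0 in use). A sketch of a more tolerant RUnit assertion; `scoresExample` here is a stand-in for the matrix the real test computes with getScores(), and the expected value is illustrative:

```r
library(RUnit)

# Stand-in for the score matrix produced in test_getScores;
# in the real test this comes from getScores() on example data.
scoresExample <- matrix(43.8289, nrow = 1, ncol = 1)

# Rather than an exact match to a rounded constant, compare with an
# explicit tolerance so small upstream numeric changes don't fail the suite:
checkEquals(scoresExample[1, 1], 43.8289, tolerance = 1e-3)

# Or assert properties of the result instead of exact values:
checkTrue(is.finite(scoresExample[1, 1]))
```

Whether a loose tolerance or a property-based check is appropriate depends on how stable the scores are meant to be across GSVA versions; a 17% relative difference, as seen here, likely reflects a genuine upstream change worth investigating rather than rounding noise.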
pathMED.Rcheck/pathMED-Ex.timings
name | user | system | elapsed
---|---|---|---
ann2term | 5.388 | 0.209 | 5.621
buildRefObject | 0.050 | 0.003 | 0.053
dissectDB | 1.360 | 0.029 | 1.390
getScores | 1.393 | 0.006 | 1.400
mScores_createReference | 10.553 | 1.145 | 11.766
mScores_filterPaths | 11.320 | 1.440 | 12.784
mScores_imputeFromReference | 11.642 | 1.323 | 13.017
methodsML | 0.014 | 0.004 | 0.019
predictExternal | 6.055 | 0.096 | 6.172
trainModel | 4.313 | 0.059 | 4.373