Friday, May 19, 2023 From rOpenSci (https://ropensci.org/blog/2023/05/19/ropensci-news-digest-may-2023/). Except where otherwise noted, content on this site is licensed under the CC-BY license.
Dear rOpenSci friends, it’s time for our monthly news roundup!
You can read this post on our blog. Now let’s dive into the activity at and around rOpenSci!
We are very happy to welcome Jouni Helske to the editorial team for statistical software review. Jouni jumped straight in, acting as editor for the fwildclusterboot submission, an exciting first extension of our system beyond R alone to include functions in the Julia language. We have so far approved eight packages: five with a silver badge and three with gold, and two further packages are currently under review.
We participated in the CSV,Conf,v7 in Buenos Aires April 19-20, 2023.
On the first day of the conference, Karthik Ram delivered one of the keynotes, “How to enable and sustain thriving Open Source Ecosystems (OSE)”. The next day, Yanina Bellini Saibene presented “Tell me who you hang out with, and I will tell you who you are. A collaborations analysis using social networks analysis”, work done together with Sandro Camargo.
This was a very special edition as it was the first time the event took place in the Southern Hemisphere, allowing several Latin American projects to be part of the conference.
The R-Universe node stack now provides data export links, which use webr to convert package datasets on the fly to JSON (via jsonlite), xlsx (via writexl), csv (via data.table), etc.
Try it yourself, for instance with the webchem package’s two datasets. You can click on the download icons near their names, or use the direct URLs, for instance https://ropensci.r-universe.dev/webchem/data/lc50/json for the lc50 dataset (Acute toxicity data from U.S. EPA ECOTOX).
The permanent URL to a dataset in a given format can be used in your browser, from R, or from any other tools: this means the R-Universe helps publish your data to the world!
On the technical side, R-universe actually runs webr (with its own webr bundle) server-side, not in a browser!
Note that R-universe already had a few features related to datasets beside listing them on a package page: datasets are indexed for search, and the standard API output for a package includes some info about datasets.
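As a quick illustration, the lc50 dataset mentioned above can be read directly from its export URL. The csv line below assumes the csv endpoint follows the same pattern as the JSON one shown earlier; a minimal sketch:

```r
# Read the webchem lc50 dataset straight from its R-universe JSON export
# (jsonlite is the same package R-universe uses server-side for this format).
library(jsonlite)
lc50 <- fromJSON("https://ropensci.r-universe.dev/webchem/data/lc50/json")
head(lc50)

# The csv export can be read with base R alone (assuming the same URL pattern):
lc50_csv <- read.csv("https://ropensci.r-universe.dev/webchem/data/lc50/csv")
```

Because these are plain HTTPS URLs, the same links work from the browser, from curl, or from any other tool that can fetch a file.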
Maëlle Salmon was interviewed about her and Scott Chamberlain’s HTTP testing in R book on the R Consortium blog.
Join us for social coworking & office hours monthly on first Tuesdays! Hosted by Steffi LaZerte and various community hosts. Everyone welcome. No RSVP needed. Consult our Events page to find your local time and how to join.
Tuesday, June 6th, 9:00 Australia Western / 01:00 UTC “Integrating and merging datasets from different sources” Hosted by community host Cynthia Huang and Steffi LaZerte
Tuesday, July 4th, 14:00 European Central / 12:00 UTC “Create/Update your ‘Happy File’/‘Brag Document’!” Hosted by Maëlle Salmon and Steffi LaZerte
And remember, you can always cowork independently on work related to R, work on packages that tend to be neglected, or work on whatever you need to get done!
The following package recently became a part of our software suite:
Discover more packages, read more about Software Peer Review.
The following twelve packages have had an update since the last newsletter: biomartr (v1.0.3), cffr (v0.5.0), crul (v1.4), geojsonio (v0.11.1), nlrx (v0.4.4), nodbi (v0.9.4), osmdata (v0.2.2), phruta (MEE), spiro (v0.2.0), tarchetypes (0.7.6), targets (1.0.0), and waywiser.
There are ten recently closed and active submissions and three submissions on hold. Issues are at different stages:
One at ‘5/awaiting-reviewer(s)-response’:
Two at ‘4/review(s)-in-awaiting-changes’:
wmm, World Magnetic Model. Submitted by Will Frierson.
octolog, Better Github Action Logging. Submitted by Jacob Wujciak-Jens.
Four at ‘3/reviewer(s)-assigned’:
pangoling, Access to Large Language Model Predictions. Submitted by Bruno Nicenboim.
ohun, Optimizing Acoustic Signal Detection. Submitted by Marcelo Araya-Salas.
dfms, Dynamic Factor Models. Submitted by Sebastian Krantz.
fwildclusterboot, Fast Wild Cluster Bootstrap Inference for Linear Models. Submitted by Alexander Fischer. (Stats).
Two at ‘2/seeking-reviewer(s)’:
mregions2, Access Data from Marineregions.org: The Marine Regions Gazetteer and the Marine Regions Data Products. Submitted by salvafern.
bssm, Bayesian Inference of Non-Linear and Non-Gaussian State Space. Submitted by Jouni Helske. (Stats).
One at ‘1/editor-checks’:
Find out more about Software Peer Review and how to get involved.
rOpenSci Champions Program Teams: Meet Bilikisu Wunmi Olatunji and Christina Maimone by Bilikisu Wunmi Olatunji, and Christina Maimone.
rOpenSci Champions Program Teams: Meet Carolina Pradier and Athanasia Monika Mowinckel by Carolina Pradier, and Athanasia Monika Mowinckel.
rOpenSci Champions Program Teams: Meet Haydee Svab and Beatriz Milz by Haydee Svab, and Beatriz Milz.
rOpenSci Champions Program Teams: Meet Victor Ordu and Laura DeCicco by Victor Ordu, and Laura DeCicco.
rOpenSci Champions Program Teams: Meet César and Marc by Cesar Luis Aybar Camacho, and Marc Choisy.
If you’re interested in maintaining any of the R packages below, you might enjoy reading our blog post What Does It Mean to Maintain a Package? (or listening to its discussion on the R Weekly highlights podcast hosted by Eric Nantz and Mike Thomas)!
rvertnet, Retrieve, map and summarize data from the VertNet.org archives (https://vertnet.org/). Functions allow searching by many parameters, including taxonomic names, places, and dates. In addition, there is an interface for conducting spatially delimited searches, and another for requesting large datasets via email. Issue for volunteering.
natserv. Interface to NatureServe (https://www.natureserve.org/). Includes methods to get data, image metadata, search taxonomic names, and make maps. Issue for volunteering.
Refer to our somewhat recent blog post to identify other packages where help is especially welcome! See also our help wanted page – before opening a PR, we recommend asking in the issue whether help is still needed.
Some useful tips for R package developers. 👀
Are you thinking about building an API client? Check out the new project by Jon Harmon, recently funded by the R Consortium:
Exciting to follow!
Did you miss the recent coworking session on Spring/Fall cleaning of R packages?
One nice function shared during that meeting by Andy Teucher, R Package Developer Educator at Posit PBC, is usethis::use_upkeep_issue(), available in the development version of usethis.
“This opens an issue in your package repository with a checklist of tasks for regular maintenance of your package.”
Why not try it on one of your packages?
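A minimal sketch of trying it, assuming you run it from within a package project that has a GitHub remote configured:

```r
# use_upkeep_issue() is only in the development version of usethis,
# so install that first (pak shown here as one option):
# pak::pak("r-lib/usethis")
library(usethis)

# Run from the root of your package project: this opens an issue in the
# package's GitHub repository with a checklist of maintenance tasks.
use_upkeep_issue()
```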
The tidyverse team drafted a guide about their code review process.
On the same topic, have you heard of optimistic merging?
At rOpenSci, we have long recommended using actively maintained packages (like xml2, and crul or curl) rather than the unmaintained XML and RCurl packages. It seems the CRAN team is now looking for new maintainers for them. See also the relevant thread of the R Consortium’s repositories working group. This last item suggests that maintenance of these packages may be moving towards graceful deprecation, which is another reason to move your package away from XML and RCurl when you can!
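As a minimal sketch of what the switch can look like, here is a small document parsed with xml2 instead of XML, using xml2's standard read/find/extract functions (the inline XML string is a made-up example):

```r
library(xml2)

# Parse and query XML with xml2 instead of XML::xmlParse()/xpathSApply()
doc <- read_xml("<records><record id='1'>a</record><record id='2'>b</record></records>")
xml_text(xml_find_all(doc, "//record"))
#> [1] "a" "b"

# For HTTP, curl (or crul) covers the ground RCurl::getURL() used to:
# res <- curl::curl_fetch_memory("https://ropensci.org")
```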
As stated in the dev guide:
For spatial data, the sp package should be considered deprecated in favor of sf, and the packages rgdal, maptools, and rgeos will be retired by the end of 2023. We recommend use of the spatial suites developed by the r-spatial and rspatial communities. See this GitHub issue for relevant discussions.
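For maintainers starting that migration, a minimal sketch of sf equivalents for common sp/rgdal idioms, using the North Carolina shapefile shipped with sf as example data:

```r
library(sf)

# Reading vector data: st_read() replaces rgdal::readOGR()
nc <- st_read(system.file("shape/nc.shp", package = "sf"), quiet = TRUE)

# Converting an existing Spatial* object from sp:
# nc_sf <- st_as_sf(nc_sp)

# Reprojection: st_transform() replaces sp::spTransform()
nc_wgs84 <- st_transform(nc, 4326)
```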
Please tell your friends!
Thanks for reading! If you want to get involved with rOpenSci, check out our Contributing Guide that can help direct you to the right place, whether you want to make code contributions, non-code contributions, or contribute in other ways like sharing use cases.
If you haven’t subscribed to our newsletter yet, you can do so via a form. Until it’s time for our next newsletter, you can keep in touch with us via our website and Mastodon account.