Using the scVI method of Integration in Seurat v5 #7164

Closed
cathalgking opened this issue Apr 14, 2023 · 42 comments
Labels
bug Something isn't working

Comments

@cathalgking

I am comparing integration methods using the IntegrateLayers() function and am running into an error when trying to use the scVI method. The other methods I have tried (harmony, rpca and cca) work fine.
I am pointing to an anaconda environment with scvi-tools installed. I believe the error is related to the "method=" argument in the function call. The package documentation (?IntegrateLayers) does not list scVI as an integration method; however, the integrative analysis vignette uses it just fine.

Error in is_quosure(x = method) : object 'scVIIntegration' not found

object <- IntegrateLayers(
  object = object, method = scVIIntegration,
  new.reduction = "integrated.scvi",
  conda_env = "/Users/cathal.king/opt/anaconda3/envs/scvi-env", verbose = FALSE
)

https://satijalab.org/seurat/articles/seurat5_integration.html#layers-in-the-seurat-v5-object

@cathalgking cathalgking added the bug Something isn't working label Apr 14, 2023
@mdu4003

mdu4003 commented Apr 17, 2023

Same issue here.

@codeneeded

FastMNN also doesn't work, despite being in the documentation.

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Hi all,
Have you installed the v5 version of SeuratWrappers as well?

remotes::install_github("satijalab/seurat-wrappers", "seurat5", quiet = TRUE)
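
For reference, a minimal sketch of the full sequence (the conda_env path is the one from the original post; adjust it to your machine):

remotes::install_github("satijalab/seurat-wrappers", "seurat5", quiet = TRUE)

library(Seurat)
library(SeuratWrappers)

exists("scVIIntegration")  # TRUE once the seurat5 branch is installed and loaded

object <- IntegrateLayers(
  object = object, method = scVIIntegration,
  new.reduction = "integrated.scvi",
  conda_env = "/Users/cathal.king/opt/anaconda3/envs/scvi-env", verbose = FALSE
)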

@mdu4003

mdu4003 commented Apr 18, 2023

Hello Gesmira,
Yes, I did install the v5 version.
Thanks

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

And have you loaded it with library(SeuratWrappers)? When I do so, I can see FastMNNIntegration and scVIIntegration in the IntegrateLayers documentation and can use them.

@mdu4003

mdu4003 commented Apr 18, 2023

Sorry Gesmira, is v0.3.1 of SeuratWrappers the correct one?
I still do not see the scVIIntegration command.

@mdu4003

mdu4003 commented Apr 18, 2023

Also, when I try
remotes::install_github("satijalab/seurat-wrappers", "seurat5", quiet = TRUE)
I get
Warning message:
In i.p(...) :
installation of package ‘/var/folders/v3/wf1x89bd17v8m0dwn7mzhx7r0000gn/T//Rtmp36C7vs/file1c1d769e1e19/SeuratWrappers_0.3.1.tar.gz’ had non-zero exit status

But it seems I had previously installed it with
devtools::install_github('satijalab/seurat-wrappers')

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

I just updated the version number, so it is now 0.3.19. Can you repeat the installation with quiet = FALSE so we can see why it is failing?
In the devtools command you included, you are not installing the seurat5 branch of SeuratWrappers, so you will not have those new methods available. Assuming you are missing some dependencies, make sure you have installed all v5 versions in a fresh R session. Alternatively, if you prefer to use devtools, you can also run: devtools::install_github('satijalab/seurat-wrappers', 'seurat5')

@mdu4003

mdu4003 commented Apr 18, 2023

I got this

remotes::install_github("satijalab/seurat-wrappers", "seurat5", quiet = FALSE)
Using github PAT from envvar GITHUB_PAT
Downloading GitHub repo satijalab/seurat-wrappers@seurat5
── R CMD build ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
✔ checking for file ‘/private/var/folders/v3/wf1x89bd17v8m0dwn7mzhx7r0000gn/T/Rtmp36C7vs/remotes1c1d315403bb/satijalab-seurat-wrappers-a59e9b2/DESCRIPTION’ ...
─ preparing ‘SeuratWrappers’:
✔ checking DESCRIPTION meta-information ...
─ checking for LF line-endings in source and make files and shell scripts
─ checking for empty or unneeded directories
Omitted ‘LazyData’ from DESCRIPTION
─ building ‘SeuratWrappers_0.3.19.tar.gz’

  • installing source package ‘SeuratWrappers’ ...
    ** using staged installation
    ** R
    ** byte-compile and prepare package for lazy loading
    ** help
    *** installing help indices
    ** building package indices
    ** testing if installed package can be loaded from temporary location
    Error: package or namespace load failed for ‘SeuratWrappers’ in namespaceExport(ns, exports):
    undefined exports: [batchelor]{fastMNN}
    Error: loading failed
    Execution halted
    ERROR: loading failed
  • removing ‘/Library/Frameworks/R.framework/Versions/4.2/Resources/library/SeuratWrappers’
  • restoring previous ‘/Library/Frameworks/R.framework/Versions/4.2/Resources/library/SeuratWrappers’
    Warning message:
    In i.p(...) :
    installation of package ‘/var/folders/v3/wf1x89bd17v8m0dwn7mzhx7r0000gn/T//Rtmp36C7vs/file1c1d2ff843a7/SeuratWrappers_0.3.19.tar.gz’ had non-zero exit status

@inofechm

I managed to get FastMNN working in my fork of the repository.
I had to switch export("[batchelor]{fastMNN}") to export(FastMNNIntegration) in the NAMESPACE file.
You can install it using remotes::install_github('inofechm/seurat-wrappers', 'seurat5').

@mdu4003

mdu4003 commented Apr 18, 2023

Hello @inofechm,
This helped in RStudio. I could install it, and when I run IntegrateLayers now, the error is different; I suppose it is coming from my scvi conda environment.
My error is:
Error: ModuleNotFoundError: No module named 'scanpy'.
I checked, and I do have scanpy. What should I do?

scanpy 1.9.3 pypi_0 pypi

These are all the packages in my scvi environment:

Name Version Build Channel

absl-py 1.4.0 pyhd8ed1ab_0 conda-forge
anndata 0.8.0 pyhd8ed1ab_1 conda-forge
brotli 1.0.9 hb7f2c08_8 conda-forge
brotli-bin 1.0.9 hb7f2c08_8 conda-forge
bzip2 1.0.8 h0d85af4_4 conda-forge
c-ares 1.18.1 h0d85af4_0 conda-forge
ca-certificates 2022.12.7 h033912b_0 conda-forge
cached-property 1.5.2 hd8ed1ab_1 conda-forge
cached_property 1.5.2 pyha770c72_1 conda-forge
certifi 2022.12.7 pyhd8ed1ab_0 conda-forge
charset-normalizer 3.1.0 pypi_0 pypi
chex 0.1.7 pyhd8ed1ab_0 conda-forge
colorama 0.4.6 pyhd8ed1ab_0 conda-forge
contextlib2 21.6.0 pyhd8ed1ab_0 conda-forge
contourpy 1.0.7 py39h92daf61_0 conda-forge
cycler 0.11.0 pyhd8ed1ab_0 conda-forge
dm-tree 0.1.7 py39hbd61c47_0 conda-forge
docrep 0.3.2 pyh44b312d_0 conda-forge
et_xmlfile 1.1.0 pyhd8ed1ab_0 conda-forge
filelock 3.11.0 pyhd8ed1ab_0 conda-forge
flax 0.6.1 pyhd8ed1ab_1 conda-forge
fonttools 4.39.3 py39ha30fb19_0 conda-forge
freetype 2.12.1 h3f81eb7_1 conda-forge
fsspec 2023.3.0 pyhd8ed1ab_1 conda-forge
gmp 6.2.1 h2e338ed_0 conda-forge
gmpy2 2.1.2 py39h2da61ea_1 conda-forge
h5py 3.8.0 nompi_py39h03e16b4_101 conda-forge
hdf5 1.14.0 nompi_hbf0aa07_103 conda-forge
idna 3.4 pypi_0 pypi
importlib-metadata 6.2.0 pyha770c72_0 conda-forge
importlib-resources 5.12.0 pyhd8ed1ab_0 conda-forge
importlib_metadata 6.2.0 hd8ed1ab_0 conda-forge
importlib_resources 5.12.0 pyhd8ed1ab_0 conda-forge
jax 0.3.25 pyhd8ed1ab_0 conda-forge
jaxlib 0.3.25 py39h5c6ac89_1
jinja2 3.1.2 pyhd8ed1ab_1 conda-forge
joblib 1.2.0 pyhd8ed1ab_0 conda-forge
kiwisolver 1.4.4 py39h92daf61_1 conda-forge
krb5 1.20.1 h049b76e_0 conda-forge
lcms2 2.15 h2dcdeff_1 conda-forge
lerc 4.0.0 hb486fe8_0 conda-forge
libaec 1.0.6 hf0c8a7f_1 conda-forge
libblas 3.9.0 16_osx64_openblas conda-forge
libbrotlicommon 1.0.9 hb7f2c08_8 conda-forge
libbrotlidec 1.0.9 hb7f2c08_8 conda-forge
libbrotlienc 1.0.9 hb7f2c08_8 conda-forge
libcblas 3.9.0 16_osx64_openblas conda-forge
libcurl 7.88.1 h6df9250_1 conda-forge
libcxx 14.0.6 h9765a3e_0
libdeflate 1.18 hac1461d_0 conda-forge
libedit 3.1.20191231 h0678c8f_2 conda-forge
libev 4.33 haf1e3a3_1 conda-forge
libffi 3.4.2 hecd8cb5_6
libgfortran 5.0.0 11_3_0_h97931a8_31 conda-forge
libgfortran5 12.2.0 he409387_31 conda-forge
libjpeg-turbo 2.1.5.1 hb7f2c08_0 conda-forge
liblapack 3.9.0 16_osx64_openblas conda-forge
libnghttp2 1.52.0 he2ab024_0 conda-forge
libopenblas 0.3.21 openmp_h429af6e_3 conda-forge
libpng 1.6.39 ha978bb4_0 conda-forge
libprotobuf 3.21.12 hbc0c0cd_0 conda-forge
libsqlite 3.40.0 ha978bb4_0 conda-forge
libssh2 1.10.0 h47af595_3 conda-forge
libtiff 4.5.0 hedf67fa_6 conda-forge
libwebp-base 1.3.0 hb7f2c08_0 conda-forge
libxcb 1.13 h0d85af4_1004 conda-forge
libzlib 1.2.13 hfd90126_4 conda-forge
lightning-utilities 0.8.0 pyhd8ed1ab_0 conda-forge
llvm-openmp 16.0.1 h61d9ccf_0 conda-forge
llvmlite 0.39.1 pypi_0 pypi
markdown-it-py 2.2.0 pyhd8ed1ab_0 conda-forge
markupsafe 2.1.2 py39ha30fb19_0 conda-forge
matplotlib-base 3.7.1 py39hb2f573b_0 conda-forge
mdurl 0.1.0 pyhd8ed1ab_0 conda-forge
mkl 2022.2.1 h44ed08c_16952 conda-forge
ml-collections 0.1.1 pyhd8ed1ab_0 conda-forge
mpc 1.3.1 h81bd1dd_0 conda-forge
mpfr 4.2.0 h4f9bd69_0 conda-forge
mpmath 1.3.0 pyhd8ed1ab_0 conda-forge
msgpack-python 1.0.5 py39h92daf61_0 conda-forge
mudata 0.2.1 pyhd8ed1ab_0 conda-forge
multipledispatch 0.6.0 py_0 conda-forge
munkres 1.1.4 pyh9f0ad1d_0 conda-forge
natsort 8.3.1 pyhd8ed1ab_0 conda-forge
ncurses 6.4 hcec6c5f_0
networkx 3.1 pyhd8ed1ab_0 conda-forge
numba 0.56.4 pypi_0 pypi
numpy 1.23.5 pypi_0 pypi
numpyro 0.11.0 pyhd8ed1ab_0 conda-forge
openjpeg 2.5.0 h13ac156_2 conda-forge
openpyxl 3.1.1 py39ha30fb19_0 conda-forge
openssl 3.1.0 hfd90126_0 conda-forge
opt_einsum 3.3.0 pyhd8ed1ab_1 conda-forge
optax 0.1.4 pyhd8ed1ab_0 conda-forge
packaging 23.0 pyhd8ed1ab_0 conda-forge
pandas 2.0.0 py39hecff1ad_0 conda-forge
patsy 0.5.3 pypi_0 pypi
pillow 9.5.0 py39h77c96bc_0 conda-forge
pip 23.0.1 py39hecd8cb5_0
pthread-stubs 0.4 hc929b4f_1001 conda-forge
pygments 2.14.0 pyhd8ed1ab_0 conda-forge
pynndescent 0.5.9 pypi_0 pypi
pyparsing 3.0.9 pyhd8ed1ab_0 conda-forge
pyro-api 0.1.2 pyhd8ed1ab_0 conda-forge
pyro-ppl 1.8.4 pyhd8ed1ab_0 conda-forge
python 3.9.16 h709bd14_0_cpython conda-forge
python-dateutil 2.8.2 pyhd8ed1ab_0 conda-forge
python-tzdata 2023.3 pyhd8ed1ab_0 conda-forge
python_abi 3.9 2_cp39 conda-forge
pytorch 2.0.0 cpu_py39h5404d98_0 conda-forge
pytorch-lightning 1.9.4 pyhd8ed1ab_1 conda-forge
pytz 2023.3 pyhd8ed1ab_0 conda-forge
pyyaml 6.0 py39ha30fb19_5 conda-forge
readline 8.2 hca72f7f_0
requests 2.28.2 pypi_0 pypi
rich 13.3.3 pyhd8ed1ab_0 conda-forge
scanpy 1.9.3 pypi_0 pypi
scikit-learn 1.2.2 py39h151e6e4_1 conda-forge
scipy 1.9.3 py39h8a15683_2 conda-forge
scvi-tools 0.20.3 pyhd8ed1ab_0 conda-forge
seaborn 0.12.2 pypi_0 pypi
session-info 1.0.0 pypi_0 pypi
setuptools 65.6.3 py39hecd8cb5_0
six 1.16.0 pyh6c4a22f_0 conda-forge
sleef 3.5.1 h6db0672_2 conda-forge
sqlite 3.41.1 h6c40b1e_0
statsmodels 0.13.5 pypi_0 pypi
stdlib-list 0.8.0 pypi_0 pypi
sympy 1.11.1 pypyh9d50eac_103 conda-forge
tbb 2021.8.0 hb8565cd_0 conda-forge
threadpoolctl 3.1.0 pyh8a188c0_0 conda-forge
tk 8.6.12 h5d9f67b_0
toolz 0.12.0 pyhd8ed1ab_0 conda-forge
torchaudio 2.0.1 pypi_0 pypi
torchmetrics 0.11.4 pyhd8ed1ab_0 conda-forge
torchvision 0.15.1 pypi_0 pypi
tqdm 4.65.0 pyhd8ed1ab_1 conda-forge
typing-extensions 4.5.0 hd8ed1ab_0 conda-forge
typing_extensions 4.5.0 pyha770c72_0 conda-forge
tzdata 2023c h04d1e81_0
umap-learn 0.5.3 pypi_0 pypi
unicodedata2 15.0.0 py39ha30fb19_0 conda-forge
urllib3 1.26.15 pypi_0 pypi
wheel 0.38.4 py39hecd8cb5_0
xlrd 1.2.0 pyh9f0ad1d_1 conda-forge
xorg-libxau 1.0.9 h35c211d_0 conda-forge
xorg-libxdmcp 1.1.3 h35c211d_0 conda-forge
xz 5.2.10 h6c40b1e_1
yaml 0.2.5 h0d85af4_2 conda-forge
zipp 3.15.0 pyhd8ed1ab_0 conda-forge
zlib 1.2.13 hfd90126_4 conda-forge
zstd 1.5.2 hbc0c0cd_6 conda-forge

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

I managed to get FastMNN working in my fork of the repository.
I had to switch export("[batchelor]{fastMNN}") to export(FastMNNIntegration) in the NAMESPACE file.
You can install it using remotes::install_github('inofechm/seurat-wrappers', 'seurat5').

@inofechm thanks for addressing the fix! Would you be able to open a PR to seurat-wrappers with your changes? Thanks!

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

@mdu4003 Yes, this seems to be an error with the environment and likely reticulate. Are you pointing to the correct conda environment in the IntegrateLayers call? Another fix could be to restart your R session and try again, to ensure reticulate is using the correct environment.
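
A minimal sketch of pinning reticulate to the scvi environment in a fresh session, before calling IntegrateLayers (the path is the one used later in this thread; adjust it to your machine):

library(reticulate)
use_condaenv("/Users/diazmeco/.conda/envs/scvi-env", required = TRUE)
py_config()                    # confirm which Python reticulate picked up
py_module_available("scanpy")  # should return TRUE before running scVIIntegration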

@inofechm

I managed to get FastMNN working in my fork of the repository.
I had to switch export("[batchelor]{fastMNN}") to export(FastMNNIntegration) in the NAMESPACE file.
You can install it using remotes::install_github('inofechm/seurat-wrappers', 'seurat5').

@inofechm thanks for addressing the fix! Would you be able to open a PR to seurat-wrappers with your changes? Thanks!

I've made a PR. Thanks for developing these exciting new features; they've been awesome in my hands!

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Great, it's merged now! Glad to hear it!

@mdu4003

mdu4003 commented Apr 18, 2023

Hi!
Thank you very much @inofechm.
I checked and corrected my environment; I think it is the correct one now.
I ran into another type of error this time, and I do not know what to do here.
Would you mind checking it, please?
Thank you very much
obj_SCVI<-IntegrateLayers(object=obj, method=scVIIntegration, new.reduction="integrated.scvi",conda_env="/Users/diazmeco/.conda/envs/scvi-env")
Global seed set to 0
/Users/diazmeco/.conda/envs/scvi-env/lib/python3.9/site-packages/flax/struct.py:132: FutureWarning: jax.tree_util.register_keypaths is deprecated, and will be removed in a future release. Please use register_pytree_with_keys() instead.
jax.tree_util.register_keypaths(data_clz, keypaths)
/Users/diazmeco/.conda/envs/scvi-env/lib/python3.9/site-packages/flax/struct.py:132: FutureWarning: jax.tree_util.register_keypaths is deprecated, and will be removed in a future release. Please use register_pytree_with_keys() instead.
jax.tree_util.register_keypaths(data_clz, keypaths)
Error in UseMethod(generic = "JoinLayers", object = object) :
no applicable method for 'JoinLayers' applied to an object of class "c('SCTAssay', 'Assay', 'KeyMixin')"

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Can you share all the code you ran before IntegrateLayers()?
It seems like your assays may not be v5 assays. Have you run this: options(Seurat.object.assay.version = "v5") before creating your SeuratObject?
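
A minimal check, assuming counts is a counts matrix; the option has to be set before the object is created:

library(Seurat)
options(Seurat.object.assay.version = "v5")
obj <- CreateSeuratObject(counts = counts)
class(obj[["RNA"]])               # should report "Assay5"
inherits(obj[["RNA"]], "Assay5")  # TRUE for a v5 assay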

@mdu4003

mdu4003 commented Apr 18, 2023

Hello @Gesmira,
Thank you very much for your help.
That's what I thought too.
I restarted the session, and this is my code:

library(Seurat)
options(Seurat.object.assay.version = "v5")
library(SeuratData)
library(SeuratWrappers)
library(Azimuth)
library(ggplot2)
library(patchwork)
library(gtable)
library(reticulate)
library(harmony)
library(pbmcref.SeuratData)
library(SeuratDisk)
options(future.globals.maxSize = 1e9)

#Integrative analysis in Seurat v5
#Compiled: March 27, 2023
#Source: vignettes/seurat5_integration.Rmd

#Layers in the Seurat v5 object
#Seurat v5 assays store data in layers. These layers can store raw, un-normalized counts (layer='counts'), normalized data (layer='data'), or z-scored/variance-stabilized data (layer='scale.data'). We can load in the data, remove low-quality cells, and obtain predicted cell annotations (which will be useful for assessing integration later), using our Azimuth pipeline.

# load dataset

FU_I_d1_T1_counts <- read.csv(file = "/Users/diazmeco/MoscatDiazMecoLab Dropbox/Moscat Lab/0.OMICS_WCM/Z.Z.COLLABORATIONS/NGS_Luxembourg_ElizabethLetellier_feb2023/seurat_GSE134255_GSE199999/GSE199999/GSE199999_RAW/GSM6001734_281_d1_1.csv.gz", header = TRUE, row.names = 1)

FU_I_d1_T1 <- CreateSeuratObject(counts = t(FU_I_d1_T1_counts), project = "d1_1", min.cells = 3, min.features = 200)
FU_I_d1_T1

An object of class Seurat
15702 features across 752 samples within 1 assay
Active assay: RNA (15702 features, 0 variable features)
1 layer present: counts

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Do you then split this object into multiple layers for doing integration? Like in the vignette: obj[["RNA"]] <- split(obj[["RNA"]], f = obj$Method) or do you add in other objects?
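
For context, a sketch of that splitting step, adapted from the vignette to split on orig.ident as used later in this thread:

# one counts/data layer per sample; IntegrateLayers then operates across these layers
obj[["RNA"]] <- split(obj[["RNA"]], f = obj$orig.ident)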

@mdu4003

mdu4003 commented Apr 18, 2023

Sorry @Gesmira. I have 4 objects (different collection times)

This is the full code:

library(Seurat)
options(Seurat.object.assay.version = "v5")
library(SeuratData)
library(SeuratWrappers)
library(Azimuth)
library(ggplot2)
library(patchwork)
library(gtable)
library(reticulate)
library(harmony)
library(pbmcref.SeuratData)
library(SeuratDisk)
options(future.globals.maxSize = 1e9)

#Integrative analysis in Seurat v5
#Compiled: March 27, 2023
#Source: vignettes/seurat5_integration.Rmd

#Layers in the Seurat v5 object
#Seurat v5 assays store data in layers. These layers can store raw, un-normalized counts (layer='counts'), normalized data (layer='data'), or z-scored/variance-stabilized data (layer='scale.data'). We can load in the data, remove low-quality cells, and obtain predicted cell annotations (which will be useful for assessing integration later), using our Azimuth pipeline.

# load datasets

FU_I_d1_T1_counts <- read.csv(file = "/Users/diazmeco/MoscatDiazMecoLab Dropbox/Moscat Lab/0.OMICS_WCM/Z.Z.COLLABORATIONS/NGS_Luxembourg_ElizabethLetellier_feb2023/seurat_GSE134255_GSE199999/GSE199999/GSE199999_RAW/GSM6001734_281_d1_1.csv.gz", header = TRUE, row.names = 1)
FU_I_d1_T2_counts <- read.csv(file = "/Users/diazmeco/MoscatDiazMecoLab Dropbox/Moscat Lab/0.OMICS_WCM/Z.Z.COLLABORATIONS/NGS_Luxembourg_ElizabethLetellier_feb2023/seurat_GSE134255_GSE199999/GSE199999/GSE199999_RAW/GSM6001735_3927_AS6_d1_2.csv.gz", header = TRUE, row.names = 1)
FU_I_d3_counts <- read.csv(file = "/Users/diazmeco/MoscatDiazMecoLab Dropbox/Moscat Lab/0.OMICS_WCM/Z.Z.COLLABORATIONS/NGS_Luxembourg_ElizabethLetellier_feb2023/seurat_GSE134255_GSE199999/GSE199999/GSE199999_RAW/GSM6001736_282_d3.csv.gz", header = TRUE, row.names = 1)
FU_I_d6_counts <- read.csv(file = "/Users/diazmeco/MoscatDiazMecoLab Dropbox/Moscat Lab/0.OMICS_WCM/Z.Z.COLLABORATIONS/NGS_Luxembourg_ElizabethLetellier_feb2023/seurat_GSE134255_GSE199999/GSE199999/GSE199999_RAW/GSM6001737_4240_AS2_d6.csv.gz", header = TRUE, row.names = 1)

FU_I_d1_T1 <- CreateSeuratObject(counts = t(FU_I_d1_T1_counts), project = "d1_1", min.cells = 3, min.features = 200)
FU_I_d1_T2<- CreateSeuratObject(counts = t(FU_I_d1_T2_counts), project = "d1_2", min.cells = 3, min.features = 200)
FU_I_d3 <- CreateSeuratObject(counts = t(FU_I_d3_counts), project = "d3", min.cells = 3, min.features = 200)
FU_I_d6 <- CreateSeuratObject(counts = t(FU_I_d6_counts), project = "d6", min.cells = 3, min.features = 200)

obj<-merge(x = FU_I_d1_T1, y = list(FU_I_d1_T2,FU_I_d3,FU_I_d6))

obj <- NormalizeData(GSE199999)
obj <- FindVariableFeatures(obj)
obj <- ScaleData(obj)
obj <- RunPCA(obj)
obj <- FindNeighbors(obj, dims = 1:20, reduction = "pca")
obj <- FindClusters(obj, resolution = 2, cluster.name = "unintegrated_clusters")
obj <- RunUMAP(obj, dims = 1:20, reduction = "pca", reduction.name = "umap.unintegrated")
obj
#An object of class Seurat
#18838 features across 7451 samples within 1 assay
#Active assay: RNA (18838 features, 2000 variable features)
#9 layers present: counts.d1_1, counts.d1_2, counts.d3, counts.d6, data.d1_1, data.d1_2, data.d3, data.d6, scale.data
#2 dimensional reductions calculated: pca, umap.unintegrated

# visualize by batch and cell type annotation

# cell type annotations were previously added by Azimuth (I SKIPPED)

#Perform streamlined (one-line) integrative analysis

obj_SCVI<-IntegrateLayers(object=obj, orig.reduction = "pca", method=scVIIntegration, new.reduction='integrated.scvi', conda_env="/Users/diazmeco/.conda/envs/scvi-env", group.by = "orig.ident")


I think it is running ok now.
I am seeing this:

obj_SCVI<-IntegrateLayers(object=obj, orig.reduction = "pca", method=scVIIntegration, new.reduction='integrated.scvi', conda_env="/Users/diazmeco/.conda/envs/scvi-env", group.by = "orig.ident")
Global seed set to 0
/Users/diazmeco/.conda/envs/scvi-env/lib/python3.9/site-packages/flax/struct.py:132: FutureWarning: jax.tree_util.register_keypaths is deprecated, and will be removed in a future release. Please use register_pytree_with_keys() instead.
jax.tree_util.register_keypaths(data_clz, keypaths)
/Users/diazmeco/.conda/envs/scvi-env/lib/python3.9/site-packages/flax/struct.py:132: FutureWarning: jax.tree_util.register_keypaths is deprecated, and will be removed in a future release. Please use register_pytree_with_keys() instead.
jax.tree_util.register_keypaths(data_clz, keypaths)
sys:1: FutureWarning: X.dtype being converted to np.float32 from float64. In the next version of anndata (0.9) conversion will not be automatic. Pass dtype explicitly to avoid this warning. Pass AnnData(X, dtype=X.dtype, ...) to get the future behavour.
GPU available: False, used: False
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
Epoch 205/400: 51%|█████ | 204/400 [07:31<06:50, 2.10s/it, loss=720, v_num=1]

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Ok nice, if it's running I believe you can ignore the warnings for now. Let me know if it finishes running!

@mdu4003

mdu4003 commented Apr 18, 2023

Hello @Gesmira.
It ran perfectly. Thank you very much to both @Gesmira and @inofechm!
The difference between using scVI vs Harmony is incredible.
Thank you very much for including scVI in the pipeline.

@mdu4003

mdu4003 commented Apr 18, 2023

Only one more question:
How do you save this new Seurat v5 object? Do you use saveRDS as usual?
Thanks!!

@Gesmira
Contributor

Gesmira commented Apr 18, 2023

Happy to help @mdu4003! Yep, you can save the Seurat object with saveRDS as usual.
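
For example (the file name is arbitrary):

saveRDS(obj_SCVI, file = "obj_scvi_integrated.rds")
obj_SCVI <- readRDS("obj_scvi_integrated.rds")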

@Gesmira Gesmira closed this as completed Apr 18, 2023
@cathalgking
Author

Thanks for that @Gesmira.

I no longer get the same error after I installed SeuratWrappers. However, I now get a different error, something to do with a "mach-o file":

Error: /Users/cathal.king/opt/anaconda3/envs/scvi-env/lib/libpython3.9.dylib - dlopen(/Users/cathal.king/opt/anaconda3/envs/scvi-env/lib/libpython3.9.dylib, 0x000A): tried: '/Users/cathal.king/opt/anaconda3/envs/scvi-env/lib/libpython3.9.dylib' (mach-o file, but is an incompatible architecture (have (x86_64), need (arm64e)))

Have you any idea how to solve that?

@joaolsf

joaolsf commented Apr 26, 2023

Hi, I continue getting the same error as @mdu4003, even after splitting the object and converting to a v5 assay.
Error in UseMethod(generic = "JoinLayers", object = object) :
no applicable method for 'JoinLayers' applied to an object of class "c('SCTAssay', 'Assay', 'KeyMixin')"

@joaolsf

joaolsf commented Apr 26, 2023

The error changed when I split the object using the "SCT" assay, or when I ran the IntegrateLayers() function on the "RNA" assay (splitting based on the "RNA" slot), as below:
Error in IntegrateLayers():
! None of the features provided are found in this assay

@Gesmira
Contributor

Gesmira commented Apr 27, 2023

Hi @joaoufrj. The initial error you included implies you are calling JoinLayers on an assay that is not "Assay5". Can you confirm that inherits(obj[["RNA"]], "Assay5") returns TRUE?
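
If that check returns FALSE, one option (a sketch, not something actually run in this thread) is to convert the assay to the v5 class before integrating:

obj[["RNA"]] <- as(object = obj[["RNA"]], Class = "Assay5")
inherits(obj[["RNA"]], "Assay5")  # should now be TRUE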

@joaolsf

joaolsf commented Apr 28, 2023 via email

@stefanonard85

inherits(obj[["RNA"]], "Assay5")

I have the same issue; see below:

RES <- JoinLayers(RES)
Error in UseMethod(generic = "JoinLayers", object = object) :
no applicable method for 'JoinLayers' applied to an object of class "c('SCTAssay', 'Assay', 'KeyMixin')"

@ishwarvh

ishwarvh commented Dec 2, 2023

Hi, after splitting the object there was a message saying the assays were originally v3 but were converted to v5. Cheers, Joao


Hey @joaolsf, I am assuming you were trying to integrate an SCTransformed Seurat object with scVI. Were you successful in doing this?

@ishwarvh

ishwarvh commented Dec 4, 2023

The error changed when I split the object using the "SCT" assay, or when I ran the IntegrateLayers() function on the "RNA" assay (splitting based on the "RNA" slot), as below: Error in IntegrateLayers(): ! None of the features provided are found in this assay

Hey, were you able to integrate an SCTransformed Seurat object using scVI in Seurat v5? I am facing the same problems as you!

@hassansaei

The error changed when I split the object using the "SCT" assay, or when I ran the IntegrateLayers() function on the "RNA" assay (splitting based on the "RNA" slot), as below: Error in IntegrateLayers(): ! None of the features provided are found in this assay

Hey, were you able to integrate an SCTransformed Seurat object using scVI in Seurat v5? I am facing the same problems as you!

I encountered the same issue with SeuratWrappers, and I resolved it by calling scvi-tools directly through reticulate. Roughly, assuming 'Data' is your Seurat v5 object with multiple layers (convertFormat() below comes from the sceasy package):

library(Seurat)
library(reticulate)
library(sceasy)

scvi <- import("scvi")

# ensure the "RNA" layer is treated as a (v3-style) Assay
Data[["RNA"]] <- as(object = Data[["RNA"]], Class = "Assay")

# convert to anndata format; keep it in a separate variable so the Seurat object stays intact
adata <- convertFormat(Data, from = "seurat", to = "anndata", main_layer = "counts", drop_single_values = FALSE)

# set up and train the scvi model; you can include your categorical or continuous covariates
scvi$model$SCVI$setup_anndata(adata, categorical_covariate_keys = c("group", "sex"), continuous_covariate_keys = c("nCount_RNA", "percent.mt"))
model <- scvi$model$SCVI(adata)
model$train()

# retrieve the latent representation
latent <- model$get_latent_representation()

# turn it into a matrix with cell names as row names
latent_matrix <- as.matrix(latent)
rownames(latent_matrix) <- colnames(Data)

# create a new dimensional reduction object with the scvi embeddings
Data[["scvi"]] <- CreateDimReducObject(embeddings = latent_matrix, key = "scvi_", assay = DefaultAssay(Data))

# then run FindNeighbors, FindClusters, and RunUMAP with reduction = "scvi"
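
As a usage sketch for that last step (dims = 1:10 assumes scVI's default latent dimensionality of 10; adjust to the latent space you actually trained):

Data <- FindNeighbors(Data, reduction = "scvi", dims = 1:10)
Data <- FindClusters(Data, resolution = 1)
Data <- RunUMAP(Data, reduction = "scvi", dims = 1:10, reduction.name = "umap.scvi")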

@ishwarvh

@hassansaei Thank you for your input. I ended up using scVI in Python.
As it stands, the scVI implementation in Seurat isn't quite there.

@redtorrentCN

@hassansaei Hi, sorry to bother you! When I run this:

model$train()

An error occurred as follows:

GPU available: False, used: False
TPU available: False, using: 0 TPU cores
Error in py_call_impl(callable, call_args$unnamed, call_args$named) :
AttributeError: 'DataSplitter' object has no attribute '_has_setup_TrainerFn.FITTING'

── Python Exception Message ─────────────────────────────────────────────────────────────
Traceback (most recent call last):
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\scvi\model\base_training_mixin.py", line 77, in train
return runner()
^^^^^^^^
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\scvi\train_trainrunner.py", line 72, in call
self.trainer.fit(self.training_plan, self.data_splitter)
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\scvi\train_trainer.py", line 177, in fit
super().fit(*args, **kwargs)
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\pytorch_lightning\trainer\trainer.py", line 460, in fit
self._run(model)
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\pytorch_lightning\trainer\trainer.py", line 715, in _run
self.call_setup_hook(model) # allow user to setup lightning_module in accelerator environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1164, in call_setup_hook
self.datamodule.setup(stage=fn)
File "C:\Miniconda\envs\py311scvi\Lib\site-packages\pytorch_lightning\core\datamodule.py", line 376, in wrapped_fn
has_run = getattr(obj, attr)
^^^^^^^^^^^^^^^^^^
AttributeError: 'DataSplitter' object has no attribute '_has_setup_TrainerFn.FITTING'

── R Traceback ──────────────────────────────────────────────────────────────────────────

  1. └─model$train(max_epochs = as.integer(400))
  2. └─reticulate:::py_call_impl(callable, call_args$unnamed, call_args$named)

I can't figure it out; could you please give any suggestions? Thanks for your attention.

@hassansaei

@hassansaei Hi, sorry to bother you! When I run this:

model$train()

An error occurred as follows:

GPU available: False, used: False TPU available: False, using: 0 TPU cores Error in py_call_impl(callable, call_args$unnamed, call_args$named) : AttributeError: 'DataSplitter' object has no attribute '_has_setup_TrainerFn.FITTING'

Hi, what are your installed scvi-tools and PyTorch Lightning versions? Upgrading to the latest scvi-tools (1.0.4) and lightning (2.1.3) might fix the problem.
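
One way to do that upgrade from R via reticulate (a sketch; "py311scvi" is the conda environment name from the traceback above, and the pinned versions are the ones mentioned here):

reticulate::py_install(
  packages = c("scvi-tools==1.0.4", "lightning==2.1.3"),
  envname  = "py311scvi",
  pip      = TRUE
)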

@redtorrentCN

Hi, what are your installed scvi-tools and PyTorch Lightning versions? Upgrading to the latest scvi-tools (1.0.4) and lightning (2.1.3) might fix the problem.

hey, thanks for your answer. I'll try it.

@diala-ar

Hi, I reinstalled seurat-wrappers and tried IntegrateLayers with scVI, and I am getting the error below:

seurs = IntegrateLayers(
object=seurs, method='scVIIntegration',
orig.reduction="pca", new.reduction='integrated.scvi',
conda_env = "/Users/dabdrabb/mambaforge/envs/scvi-env", dims=1:30
)

Error in py_module_import(module, convert = convert) :
ImportError: cannot import name 'get_num_classes' from 'torchmetrics.utilities.data' (/Users/dabdrabb/mambaforge/envs/scvi-env/lib/python3.9/site-packages/torchmetrics/utilities/data.py)
Run reticulate::py_last_error() for details.

To note, torchmetrics and all its dependencies are installed in my scvi-env. Thanks

sessionInfo()
R version 4.3.2 (2023-10-31)
Platform: aarch64-apple-darwin20 (64-bit)
Running under: macOS Sonoma 14.2.1

Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.3-arm64/Resources/lib/libRlapack.dylib; LAPACK version 3.11.0

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

time zone: America/Toronto
tzcode source: internal

attached base packages:
[1] stats4 stats graphics grDevices datasets utils methods base

other attached packages:
[1] openxlsx_4.2.5.2 dplyr_1.1.4 ggpubr_0.6.0 ggplot2_3.4.4
[5] gprofiler2_0.2.2 purrr_1.0.2 SeuratWrappers_0.3.3 reticulate_1.34.0
[9] harmony_1.2.0 Rcpp_1.0.12 batchelor_1.18.1 SingleCellExperiment_1.24.0
[13] SummarizedExperiment_1.32.0 Biobase_2.62.0 GenomicRanges_1.54.1 GenomeInfoDb_1.38.5
[17] IRanges_2.36.0 S4Vectors_0.40.2 BiocGenerics_0.48.1 MatrixGenerics_1.14.0
[21] matrixStats_1.2.0 SingleR_1.0.1 Seurat_5.0.1 SeuratObject_5.0.1
[25] sp_2.1-2

loaded via a namespace (and not attached):
[1] GSVA_1.51.5 spatstat.sparse_3.0-3 bitops_1.0-7 httr_1.4.7
[5] RColorBrewer_1.1-3 doParallel_1.0.17 tools_4.3.2 sctransform_0.4.1
[9] backports_1.4.1 utf8_1.2.4 R6_2.5.1 ResidualMatrix_1.12.0
[13] HDF5Array_1.30.0 lazyeval_0.2.2 uwot_0.1.16 rhdf5filters_1.14.1
[17] withr_3.0.0 gridExtra_2.3 progressr_0.14.0 cli_3.6.2
[21] spatstat.explore_3.2-5 fastDummies_1.7.3 spatstat.data_3.0-4 ggridges_0.5.5
[25] pbapply_1.7-2 R.utils_2.12.3 parallelly_1.36.0 limma_3.58.1
[29] rstudioapi_0.15.0 RSQLite_2.3.4 generics_0.1.3 ica_1.0-3
[33] spatstat.random_3.2-2 zip_2.3.0 car_3.1-2 Matrix_1.6-5
[37] fansi_1.0.6 abind_1.4-5 R.methodsS3_1.8.2 lifecycle_1.0.4
[41] edgeR_4.0.9 carData_3.0-5 rhdf5_2.46.1 SparseArray_1.2.3
[45] Rtsne_0.17 grid_4.3.2 blob_1.2.4 promises_1.2.1
[49] crayon_1.5.2 miniUI_0.1.1.1 lattice_0.21-9 beachmat_2.18.0
[53] cowplot_1.1.2 annotate_1.80.0 KEGGREST_1.42.0 pillar_1.9.0
[57] future.apply_1.11.1 codetools_0.2-19 leiden_0.4.3.1 glue_1.7.0
[61] doFuture_1.0.1 outliers_0.15 data.table_1.14.10 remotes_2.4.2.1
[65] vctrs_0.6.5 png_0.1-8 spam_2.10-0 gtable_0.3.4
[69] cachem_1.0.8 S4Arrays_1.2.0 mime_0.12 survival_3.5-7
[73] pheatmap_1.0.12 iterators_1.0.14 pbmcapply_1.5.1 statmod_1.5.0
[77] ellipsis_0.3.2 fitdistrplus_1.1-11 ROCR_1.0-11 nlme_3.1-164
[81] bit64_4.0.5 RcppAnnoy_0.0.21 irlba_2.3.5.1 KernSmooth_2.23-22
[85] colorspace_2.1-0 DBI_1.2.1 tidyselect_1.2.0 bit_4.0.5
[89] compiler_4.3.2 graph_1.80.0 BiocNeighbors_1.20.2 DelayedArray_0.28.0
[93] plotly_4.10.4 scales_1.3.0 lmtest_0.9-40 stringr_1.5.1
[97] digest_0.6.34 goftest_1.2-3 spatstat.utils_3.0-4 XVector_0.42.0
[101] htmltools_0.5.7 pkgconfig_2.0.3 sparseMatrixStats_1.14.0 fastmap_1.1.1
[105] rlang_1.1.3 htmlwidgets_1.6.4 shiny_1.8.0 DelayedMatrixStats_1.24.0
[109] zoo_1.8-12 jsonlite_1.8.8 BiocParallel_1.36.0 R.oo_1.25.0
[113] BiocSingular_1.18.0 RCurl_1.98-1.14 magrittr_2.0.3 scuttle_1.12.0
[117] GenomeInfoDbData_1.2.11 dotCall64_1.1-1 patchwork_1.2.0 Rhdf5lib_1.24.1
[121] munsell_0.5.0 stringi_1.8.3 zlibbioc_1.48.0 MASS_7.3-60.0.1
[125] plyr_1.8.9 parallel_4.3.2 listenv_0.9.0 ggrepel_0.9.5
[129] deldir_2.0-2 Biostrings_2.70.1 splines_4.3.2 tensor_1.5
[133] locfit_1.5-9.8 igraph_1.6.0 spatstat.geom_3.2-7 ggsignif_0.6.4
[137] RcppHNSW_0.5.0 reshape2_1.4.4 ScaledMatrix_1.10.0 XML_3.99-0.16
[141] renv_1.0.3 BiocManager_1.30.22 foreach_1.5.2 httpuv_1.6.13
[145] RANN_2.6.1 tidyr_1.3.0 polyclip_1.10-6 future_1.33.1
[149] scattermore_1.2 rsvd_1.0.5 broom_1.0.5 xtable_1.8-4
[153] RSpectra_0.16-1 rstatix_0.7.2 later_1.3.2 viridisLite_0.4.2
[157] singscore_1.22.0 tibble_3.2.1 memoise_2.0.1 AnnotationDbi_1.64.1
[161] cluster_2.1.6 globals_0.16.2 GSEABase_1.64.0

@ggruenhagen3

@joaolsf @ishwarvh I got the same error saying "None of the features provided are found in this assay". I was able to solve this and successfully integrate using SCTransformed data. My solution was to set assay = "SCT" and features = VariableFeatures(obj[["SCT"]]). This is in Seurat_5.0.0.

@alicekao1118

Hi @ggruenhagen3 ,

I ran into an error while performing IntegrateLayers() on SCTransformed data. I followed your suggestion but still got the error. Below is my code. I am on Seurat_5.0.1.

samples.merge <- IntegrateLayers(
  object = samples.merge, method = scVIIntegration,
  assay="SCT",
  scale.layer = "scale.data",
  new.reduction = "integrated.scvi",
  features = VariableFeatures(samples.merge[["SCT"]]),
  conda_env = "/PATH/conda", verbose = T
)
Error in UseMethod(generic = "JoinLayers", object = object) : 
  no applicable method for 'JoinLayers' applied to an object of class "c('SCTAssay', 'Assay', 'KeyMixin')"

Thanks,
Alice

@GreenGilad
Contributor

@alicekao1118 I had the same issue. I've made a pull request to fix this problem. While waiting for it to be merged, you can install directly like this:

remotes::install_github(repo="satijalab/seurat-wrappers", ref = remotes::github_pull(184))

@artsvendsen

@alicekao1118 I had the same issue. I've made a pull request to fix this problem. While waiting for it to be merged, you can install directly like this:

remotes::install_github(repo="satijalab/seurat-wrappers", ref = remotes::github_pull(184))

It works perfectly with SCT transformed data now, thanks @GreenGilad!
