Multi Speaker Context Aware Podcast Generation
Hi everyone, welcome to this notebook!
Today, we'll build an AI-powered podcast that not only turns my Medium blog into audio but also weaves in insights from my previous posts on related topics. This makes the podcast feel more natural and builds trust in AI-generated content.
Tech Stack Used:
- LangChain
- ElevenLabs (for TTS)
- Gemini (LLM)
- LanceDB (for context retrieval)
How to use this notebook? To generate a podcast with my configuration, simply replace the sample blog text with your own blog post. To change the voices, the models, or the number of speakers, adjust the configuration at the relevant steps, as explained in the blog.
Let's go.
Install Necessary Libraries and Dataset
[27]
(pip install output trimmed: this cell installs feedparser, tantivy, lancedb, and sentence-transformers, pulling in the CUDA 12.4 wheels that torch 2.6 requires. The run ends with a benign resolver warning that google-generativeai 0.8.4 pins google-ai-generativelanguage==0.6.15 while 0.6.17 is installed.)
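The original install cell is not preserved in this export; reconstructed from the log above, it amounts to roughly the following (a sketch, and exact pinned versions will drift over time):

```shell
# Core dependencies inferred from the pip output above:
# feedparser (RSS parsing), tantivy (full-text search backend),
# lancedb (vector store), sentence-transformers (embeddings)
pip install feedparser tantivy lancedb sentence-transformers
```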
[28]
(output trimmed: this cell installs keybert (0.9.0) and downloads urls.json, the list of blog URLs used for context retrieval, from the vectordb-recipes GitHub repository.)
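For reference, loading the fetched urls.json is a one-liner, assuming it is a plain JSON array of URL strings (which matches the list printed later in the notebook). The demo below writes a tiny stand-in file so the snippet runs anywhere; in the notebook, the real file was just downloaded:

```python
import json
from pathlib import Path

# Stand-in for urls.json; in the notebook the real file is downloaded above.
Path("urls.json").write_text(json.dumps([
    "https://uselessai.in/example-post-1",
    "https://uselessai.in/example-post-2",
]))

# Load the list of blog URLs to scrape for context.
urls = json.loads(Path("urls.json").read_text())
print(f"Loaded {len(urls)} blog URLs")  # Loaded 2 blog URLs
```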
[59]
Helper Functions
[58]
Set-Up Configurations
[31]
(download progress trimmed: the sentence-transformers embedding model files, including the config, tokenizer, and model.safetensors (~91 MB), are fetched from the Hugging Face Hub on first use.)
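Before embedding, long blog posts are typically split into overlapping chunks so that each vector covers a coherent span of text. The notebook's exact chunking logic is not visible in this export; here is a minimal word-based sketch of the idea (the chunk_size and overlap values are illustrative, not the notebook's settings):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 20) -> list[str]:
    """Split text into overlapping word-based chunks for embedding."""
    words = text.split()
    # Guard against overlap >= chunk_size, which would stall the loop.
    step = max(1, chunk_size - overlap)
    chunks = []
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(words):
            break
    return chunks
```

Overlap keeps context that straddles a chunk boundary retrievable from either side.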
Creating LanceDB Cloud Vector Store + Experimenting with Search
[32]
['https://uselessai.in/how-to-create-spark-job-definitions-in-fabric-everything-you-need-to-know-to-start-0fb75b5ca432?sk=e6e62043c123f6cc805728c068bdbd17', 'https://uselessai.in/fabric-introduced-task-flows-how-to-start-building-with-microsoft-fabric-2508800a55af?sk=ccb029f394ff3c646fb283cab0847bf9', 'https://uselessai.in/using-for-loop-in-fabric-data-factory-for-parallel-processing-the-better-way-c5a884356a50?sk=eea7ae44cdd443493fcce40e4915bc32', 'https://uselessai.in/connecting-fabric-workspace-with-azure-blob-storage-trusted-workspace-connection-for-production-9f7c24a66d1b?sk=ffc921c05f711fc5b223ce48df4f1e85', 'https://theshresthshukla.medium.com/how-to-maintain-sanity-between-dev-stg-prod-in-fabric-tracking-changes-via-deployment-pipeline-984cb201f5d2?sk=9347f57e88d8beed2fc7783a6588998f', 'https://uselessai.in/microsoft-fabric-warehouse-deployment-issue-s-and-potential-solution-s-9ad360411f7a?sk=8ef1c60aa05023d443a5d00212cd4198', 'https://uselessai.in/microsoft-fabric-stored-procedure-not-reflecting-warehouse-connection-change-in-data-factory-e0dc96e6ce83']
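Under the hood, the vector search LanceDB runs over these posts boils down to nearest-neighbour lookup on embedding vectors. A dependency-free sketch of that idea with toy three-dimensional vectors (invented for illustration; LanceDB does this at scale with ANN indexes over real embeddings):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings" for three documents and a query.
docs = {
    "spark-job-definitions": [0.9, 0.1, 0.0],
    "task-flows":            [0.2, 0.8, 0.1],
    "blob-storage":          [0.1, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]

# Retrieval = pick the document whose vector is most similar to the query.
best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)  # spark-job-definitions
```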
[33]
[34]
{'title': 'How to create SPARK JOB definitions in Fabric? — Everything you need to know to start', 'headings': ['UselessAI.in', 'Notebook vs Spark Job Definition', 'How to create main and reference job files for Spark Job definition activity in Fabric?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\nShare\nNote — My blogs are 100% free to read. Stuck behind paywall? CLICK HERE to read it for free.\nIf you are reading this, I’m sure you have explored other content on this topic but didn’t find anything fruitful. Things can be quite confusing when it comes to creating Spark job definitions in Fabric — not just because it’s complex, but also because people generally prefer notebooks for processing these days.\nVery few people are working with Spark job definitions for data processing in Fabric. Maybe it’s still too early for them, or they are still exploring.\nToday, we’ll see how we can use Spark job definitions to process big data and how they offer a unique advantage over notebooks. Let’s go!\nI assume you’ve worked with Fabric notebooks. It’s easy, right? You simply write your code, perform operations, and maybe write data to tables. The advantage of using notebooks is that they…\n--\n--\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/how-to-create-spark-job-definitions-in-fabric-everything-you-need-to-know-to-start-0fb75b5ca432?sk=e6e62043c123f6cc805728c068bdbd17'}
{'title': 'Fabric Introduced Task Flows— How to start building with Microsoft Fabric?', 'headings': ['UselessAI.in', 'Including my experience with Deployment Pipelines', 'How do we start building using Microsoft Fabric with pre-defined task flows?', 'What’s new?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\n1\nShare\nNote — My blogs are 100% free to read. Stuck behind paywall? CLICK HERE to read it for FREE.\nHi everybody, welcome to this Fabric Series on UselessAI.in! Guess what? Fabric is upgrading slowly. Most of the bugs I raised have now been solved. I had a conversation with Fabric Support, where I informed them about a few bugs, and they were eventually resolved, while some are still in the process of being fixed. Feels good that the product is improving with each passing day :)\nIf you are new to Fabric or want better management of your workflow on Microsoft Fabric, this blog is for you. Fabric just got upgraded, and it has some new updates. I mean, there are many updates, but I’ll discuss a good one with you that will help you organize things in your Fabric Workspace and improve your overall project development.\nRemember how we design a high-level architecture where we finalize the flow of the project? For example, how do you think a data project would look? We gather data — it could be from cloud sources or other databases — perform transformations on it, use PySpark Notebooks, write job definitions, and maybe build dashboards or AI…\n--\n--\n1\nWE READ. WE BUILD. 
\u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/fabric-introduced-task-flows-how-to-start-building-with-microsoft-fabric-2508800a55af?sk=ccb029f394ff3c646fb283cab0847bf9'}
{'title': 'Using For-Loop in Fabric Data Factory for Parallel Processing — the better way', 'headings': ['UselessAI.in', 'How I reduced my Data Factory Pipeline processing time by 70-75%', 'A better optimization hack — Skip using set-variable activity or invoking another pipeline for parallel processing', 'How to design Data Factory pipeline for parallel processing in Fabric? : Microsoft Fabric', 'Parallel processing with “Set Variable” Activity — Fixed', 'Don’t use “Set Variable” activity in Data Factory Pipeline: Microsoft Fabric', 'How not-to use “Set Variable” activity in Fabric? (with Solution) — Fixing “Set Variable” bug on Microsoft Fabric', 'Can we do this processing without invoking another pipeline?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\n1\nShare\nNote — My blogs are 100% free to read. Stuck behind paywall? Read it for free. Click Here!\nDo you remember how we found some issues while using the set variable activity in Fabric and then encountered problems with that approach? Then we figured out a better way of using a for loop with the set variable activity by introducing another activity — invoke pipeline. Well, it solves the problem, but we have an even better way to do this. Let’s see. Check out the other blogs on this same topic here —\nuselessai.in\nuselessai.in\n--\n--\n1\nWE READ. WE BUILD. 
\u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/using-for-loop-in-fabric-data-factory-for-parallel-processing-the-better-way-c5a884356a50?sk=eea7ae44cdd443493fcce40e4915bc32'}
{'title': 'Connecting Fabric Workspace with Azure Blob Storage— Trusted Workspace Connection for Production', 'headings': ['UselessAI.in', 'What is the secured way of connecting Fabric with Azure Blob?', 'What are the different ways to connect our workspace with Azure Blob for use in sub-components?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nFeatured\nShresth Shukla\nFollow\nUselessAI.in\n--\n1\nShare\nNote — My blogs are 100% free to read. Stuck behind paywall? Read this blog for free. Click Here\nHi all, this is an interesting topic that many will find useful when deploying their solution across multiple stages of the development lifecycle on Microsoft Fabric.\nFabric Workspace allows you to connect with Azure Blob in multiple ways. You can use this connection for different purposes. One way is to use it directly inside a notebook by mounting the Blob Storage as a shortcut, allowing you to access the data seamlessly.\nAnother way to use Azure Blob in Fabric is when building a Data Factory pipeline, where copy activities move data from Blob Storage to Fabric OneLake. But what if your storage is secured behind a firewall?\nWe’ll explore both methods of connecting Blob Storage with Fabric, along with some unique insights that you won’t easily find online. Or maybe you will xd.\n--\n--\n1\nWE READ. WE BUILD. 
\u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/connecting-fabric-workspace-with-azure-blob-storage-trusted-workspace-connection-for-production-9f7c24a66d1b?sk=ffc921c05f711fc5b223ce48df4f1e85'}
{'title': 'How to maintain sanity between DEV-STG-PROD in Fabric? — Tracking Changes via Deployment Pipeline', 'headings': ['UselessAI.in', 'How to communicate between workspaces in Fabric?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\nShare\nMy blogs are 100% free to read. Stuck behind Paywall? Read this blog for FREE —Click Here\nEvery good data project goes through three stages — Development, Testing (often called Staging), and Production. Somewhere in between, there comes a situation where you make direct changes in the staging environment — either during testing or due to manual effort required after deploying certain items from development to staging.\nFor example, Fabric currently doesn’t support deploying warehouse connections automatically, so you might need to change this manually in another environment after deployment. A similar situation could happen with some parts of the code — like making manual entries in tables post-deployment or fixing bugs directly in staging during testing.\nAnd this is where things get messy. If you make changes in the staging notebook but don’t immediately apply them to the development environment, you might face issues later. This is a common mistake — people often fix bugs quickly in testing but forget to sync those changes back to development.\nHello all, welcome to this Fabric series, where we share interesting content about development and deployment on Microsoft Fabric.\n--\n--\nWE READ. WE BUILD. 
\u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://theshresthshukla.medium.com/how-to-maintain-sanity-between-dev-stg-prod-in-fabric-tracking-changes-via-deployment-pipeline-984cb201f5d2?sk=9347f57e88d8beed2fc7783a6588998f'}
{'title': 'Microsoft Fabric\u200aWarehouse Deployment Issue(s) and Potential Solution(s)\u200a— DmsImportDatabaseException', 'headings': ['UselessAI.in', 'HOW TO SOLVE THIS ERROR — “DMSIMPORTDATABASEEXCEPTION”', 'How to deploy warehouses in Microsoft Fabric without database exception?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\n1\nShare\nNOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤\nWho am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI.\nFirst things first! This will be a short and very specific blog about the issue I faced during the deployment of a warehouse in Microsoft Fabric. I needed to fix it to ensure it went into the testing environment!\nNote that before you even think about deployment, you need to be an admin of the workspace, and you should have a Microsoft subscription, as mentioned in their documentation —\n--\n--\n1\nWE READ. WE BUILD. 
\u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/microsoft-fabric-warehouse-deployment-issue-s-and-potential-solution-s-9ad360411f7a?sk=8ef1c60aa05023d443a5d00212cd4198'}
{'title': 'Microsoft Fabric — Stored Procedure Not Reflecting Warehouse Connection Change in Data Factory', 'headings': ['UselessAI.in', 'Stored procedure activities do not persist warehouse connection changes. :)', 'How to fix warehouse connection in stored procedure activity after pipeline deployment without deleting activity?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], 'content': 'Sign up\nSign in\nSign up\nSign in\nHome\nLibrary\nStories\nStats\nHome\nNewsletter\nAbout\nFollow publication\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nFollow publication\nMember-only story\nShresth Shukla\nFollow\nUselessAI.in\n--\nShare\nNOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤\nWho am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI.\nHi everybody, this is Part 2 of the Microsoft Fabric Series. In the first part, we learned and explored potential issues that might arise during warehouse deployment. A very common issue occurs when deploying for the first time and moving from the development to the staging/test environment. You can find it here.\n--\n--\nWE READ. WE BUILD. \u200a—\u200aLearning AI by reading and building.\nData & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU\nHelp\nStatus\nAbout\nCareers\nPress\nBlog\nPrivacy\nTerms\nText to speech\nTeams', 'url': 'https://uselessai.in/microsoft-fabric-stored-procedure-not-reflecting-warehouse-connection-change-in-data-factory-e0dc96e6ce83'}
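The records above are what one scraped blog post looks like: a dict with `title`, `headings`, `content`, and `url`. As a rough, standard-library-only sketch of how such a record can be assembled from a fetched page (the notebook's actual scraping uses feedparser and its own fetch logic; the class and function names here are illustrative, not the notebook's):

```python
from html.parser import HTMLParser

class BlogParser(HTMLParser):
    """Collects the <title>, heading tags, and paragraph text from a blog page."""
    def __init__(self):
        super().__init__()
        self.title, self.headings, self.paragraphs = "", [], []
        self._tag = None  # tag we are currently inside, if we care about it

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3", "p"):
            self._tag = tag

    def handle_endtag(self, tag):
        if tag == self._tag:
            self._tag = None

    def handle_data(self, data):
        text = data.strip()
        if not text or self._tag is None:
            return
        if self._tag == "title":
            self.title += text
        elif self._tag == "p":
            self.paragraphs.append(text)
        else:
            self.headings.append(text)

def parse_blog(html: str, url: str) -> dict:
    """Turn raw page HTML into the {'title', 'headings', 'content', 'url'} shape above."""
    parser = BlogParser()
    parser.feed(html)
    return {"title": parser.title,
            "headings": parser.headings,
            "content": "\n".join(parser.paragraphs),
            "url": url}
```

In practice you would fetch each URL (e.g. with `urllib.request` or `requests`) and pass the response body to `parse_blog`; the Medium chrome ("Sign up", "Follow publication", etc.) visible in the outputs above is exactly the kind of residue a stricter parser could filter out.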
[35]
[36]
[37]
[38]
[39]
Reference : 0 -> Title - How to maintain sanity between DEV-STG-PROD in Fabric? — Tracking Changes via Deployment Pipeline, Headings - ['UselessAI.in', 'How to communicate between workspaces in Fabric?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- Share My blogs are 100% free to read. Stuck behind Paywall? Read this blog for FREE —Click Here Every good data project goes through three stages — Development, Testing (often called Staging), and Production. Somewhere in between, there comes a situation where you make direct changes in the staging environment — either during testing or due to manual effort required after deploying certain items from development to staging. For example, Fabric currently doesn’t support deploying warehouse connections automatically, so you might need to change this manually in another environment after deployment. A similar situation could happen with some parts of the code — like making manual entries in tables post-deployment or fixing bugs directly in staging during testing. And this is where things get messy. If you make changes in the staging notebook but don’t immediately apply them to the development environment, you might face issues later. This is a common mistake — people often fix bugs quickly in testing but forget to sync those changes back to development. Hello all, welcome to this Fabric series, where we share interesting content about development and deployment on Microsoft Fabric. -- -- WE READ. WE BUILD. — Learning AI by reading and building. 
Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams Reference : 1 -> Title - Microsoft Fabric Warehouse Deployment Issue(s) and Potential Solution(s) — DmsImportDatabaseException, Headings - ['UselessAI.in', 'HOW TO SOLVE THIS ERROR — “DMSIMPORTDATABASEEXCEPTION”', 'How to deploy warehouses in Microsoft Fabric without database exception?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- 1 Share NOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤ Who am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI. First things first! This will be a short and very specific blog about the issue I faced during the deployment of a warehouse in Microsoft Fabric. I needed to fix it to ensure it went into the testing environment! Note that before you even think about deployment, you need to be an admin of the workspace, and you should have a Microsoft subscription, as mentioned in their documentation — -- -- 1 WE READ. WE BUILD. — Learning AI by reading and building. 
Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams Reference : 2 -> Title - Microsoft Fabric — Stored Procedure Not Reflecting Warehouse Connection Change in Data Factory, Headings - ['UselessAI.in', 'Stored procedure activities do not persist warehouse connection changes. :)', 'How to fix warehouse connection in stored procedure activity after pipeline deployment without deleting activity?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- Share NOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤ Who am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI. Hi everybody, this is Part 2 of the Microsoft Fabric Series. In the first part, we learned and explored potential issues that might arise during warehouse deployment. A very common issue occurs when deploying for the first time and moving from the development to the staging/test environment. You can find it here. -- -- WE READ. WE BUILD. — Learning AI by reading and building. Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams
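The block above is the retrieved context flattened into one string for the LLM prompt. A minimal sketch of that formatting step, assuming the rows have already come back from a LanceDB similarity search (the function name is illustrative; only the output layout is taken from the notebook):

```python
def build_reference_context(docs: list[dict]) -> str:
    """Flatten retrieved blog records into a single context string,
    matching the 'Reference : i -> Title - ...' layout printed above."""
    parts = []
    for i, doc in enumerate(docs):
        parts.append(
            f"Reference : {i} -> Title - {doc['title']}, "
            f"Headings - {doc['headings']}, Content - {doc['content']}"
        )
    return " ".join(parts)
```

The resulting string is what gets injected into the podcast-generation prompt so the LLM can weave in earlier posts on the same topic.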
Generate Podcast using Gemini and ElevenLabs
[40]
[41]
Extracted Keywords: ['warehouse deployment', 'deployment fabric', 'deploying fabric', 'deploy warehouse', 'deploy warehouses']
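The notebook extracts search keywords from the new blog so it can query LanceDB for related past posts. The actual extraction goes through the LLM; as a self-contained stand-in, a naive bigram-frequency extractor produces keywords of the same shape (stopword list and function name are my own, not the notebook's):

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real one would be much larger.
STOPWORDS = {"the", "a", "an", "in", "to", "of", "and", "for", "is", "this", "with", "on", "how"}

def extract_keywords(text: str, top_k: int = 5) -> list[str]:
    """Naive bigram-frequency keyword extraction (stand-in for the LLM-based step)."""
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
    bigrams = Counter(zip(words, words[1:]))
    return [" ".join(pair) for pair, _ in bigrams.most_common(top_k)]
```

Whichever extractor you use, the resulting phrases ('warehouse deployment', 'deploy warehouses', ...) become the retrieval queries for the context step.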
[42]
Reference : 0 -> Title - Microsoft Fabric Warehouse Deployment Issue(s) and Potential Solution(s) — DmsImportDatabaseException, Headings - ['UselessAI.in', 'HOW TO SOLVE THIS ERROR — “DMSIMPORTDATABASEEXCEPTION”', 'How to deploy warehouses in Microsoft Fabric without database exception?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'Responses (1)'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- 1 Share NOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤ Who am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI. First things first! This will be a short and very specific blog about the issue I faced during the deployment of a warehouse in Microsoft Fabric. I needed to fix it to ensure it went into the testing environment! Note that before you even think about deployment, you need to be an admin of the workspace, and you should have a Microsoft subscription, as mentioned in their documentation — -- -- 1 WE READ. WE BUILD. — Learning AI by reading and building. Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams Reference : 1 -> Title - How to maintain sanity between DEV-STG-PROD in Fabric? 
— Tracking Changes via Deployment Pipeline, Headings - ['UselessAI.in', 'How to communicate between workspaces in Fabric?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- Share My blogs are 100% free to read. Stuck behind Paywall? Read this blog for FREE —Click Here Every good data project goes through three stages — Development, Testing (often called Staging), and Production. Somewhere in between, there comes a situation where you make direct changes in the staging environment — either during testing or due to manual effort required after deploying certain items from development to staging. For example, Fabric currently doesn’t support deploying warehouse connections automatically, so you might need to change this manually in another environment after deployment. A similar situation could happen with some parts of the code — like making manual entries in tables post-deployment or fixing bugs directly in staging during testing. And this is where things get messy. If you make changes in the staging notebook but don’t immediately apply them to the development environment, you might face issues later. This is a common mistake — people often fix bugs quickly in testing but forget to sync those changes back to development. Hello all, welcome to this Fabric series, where we share interesting content about development and deployment on Microsoft Fabric. -- -- WE READ. WE BUILD. — Learning AI by reading and building. 
Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams Reference : 2 -> Title - Microsoft Fabric — Stored Procedure Not Reflecting Warehouse Connection Change in Data Factory, Headings - ['UselessAI.in', 'Stored procedure activities do not persist warehouse connection changes. :)', 'How to fix warehouse connection in stored procedure activity after pipeline deployment without deleting activity?', 'Published in UselessAI.in', 'Written by Shresth Shukla', 'No responses yet'], Content - Sign up Sign in Sign up Sign in Home Library Stories Stats Home Newsletter About Follow publication WE READ. WE BUILD. — Learning AI by reading and building. Follow publication Member-only story Shresth Shukla Follow UselessAI.in -- Share NOTE: If you are unable to read this article due to a membership restriction, you can access it for free using this link — CLICK HERE. ❤ Who am I? -> Hi, Shresth Shukla this side. I’m currently working with one of the Data and AI teams at EY and use Microsoft Fabric in my day-to-day tasks related to Data Engineering, Analytics & AI. Writing this blog post was fun, and I learned a lot in the process. So, I’m sharing it with you all. Hope you like it and learn from it! If you do, give it 50 claps 👏 — it’ll motivate me to write more about Data and AI. Hi everybody, this is Part 2 of the Microsoft Fabric Series. In the first part, we learned and explored potential issues that might arise during warehouse deployment. A very common issue occurs when deploying for the first time and moving from the development to the staging/test environment. You can find it here. -- -- WE READ. WE BUILD. — Learning AI by reading and building. Data & AI @ EY, Freelance Technical Writer (AI/ML/GenAI Domain) MTech @BITS Pilani Maths Hons @DU Help Status About Careers Press Blog Privacy Terms Text to speech Teams
[44]
[48]
[{'speaker': 'Shresth',
'text': "Hey everyone, and welcome to the podcast! Today, we're diving deep "
'into the world of Fabric deployments, particularly around '
'warehouses and those tricky views.'},
{'speaker': 'Arjun',
'text': "Yeah, deployments can be a real headache. Especially when you're "
'dealing with complex dependencies.'},
{'speaker': 'Geet',
'text': "Absolutely! I've run into so many roadblocks. It's like navigating "
'a maze blindfolded sometimes.'},
{'speaker': 'Shresth',
'text': 'Tell me about it! In a recent article I wrote, I discussed some of '
'the common issues and solutions, building on a previous post where '
"I tackled the dreaded 'DmsImportDatabaseException'."},
{'speaker': 'Arjun',
'text': "Oh, I remember that one. It's usually related to missing tables in "
'the target lakehouse, right?'},
{'speaker': 'Shresth',
'text': 'Exactly. Fabric expects all dependencies to be deployed before the '
'main item. So, if your pipeline uses a notebook, deploy the '
"notebook first. Use the 'Select related' option, it's a lifesaver."},
{'speaker': 'Geet',
'text': "And for Power BI reports, 'View Lineage' is your best friend. Helps "
'you track down those hidden connections.'},
{'speaker': 'Arjun',
'text': "Good tips. But I've found that even with all dependencies in place, "
'warehouse deployments can still fail.'},
{'speaker': 'Shresth',
'text': "Right. And I've recently discovered another culprit: views. "
'Specifically, views that reference other views within the same '
"warehouse. Fabric doesn't support this yet, leading to that same "
"'DmsImportDatabaseException' error."},
{'speaker': 'Geet',
'text': "Ugh, that sounds frustrating. So, what's the workaround?"},
{'speaker': 'Shresth',
'text': 'Well, the current solution is a bit of a manual process. You have '
'to delete the view from the source workspace and recreate it in the '
'target after deployment. Not ideal, I know.'},
{'speaker': 'Arjun',
'text': "Hmm, a bit of a pain. Hopefully, they'll fix that soon. It seems "
'like a pretty fundamental feature.'},
{'speaker': 'Shresth',
'text': "I agree. I've highlighted this issue in my article, hoping to raise "
'awareness and maybe nudge the Fabric team in the right direction.'},
{'speaker': 'Geet',
'text': "It's important to share these experiences. It helps everyone in "
'the community avoid similar pitfalls.'},
{'speaker': 'Shresth',
'text': "Absolutely. I've also written about other deployment challenges, "
'like maintaining sanity between development, staging, and '
'production environments. Things can get messy when you have to '
'make manual changes in staging.'},
{'speaker': 'Arjun',
'text': 'Oh yeah, the classic dev-staging-prod synchronization problem. '
"It's so easy to lose track of changes."},
{'speaker': 'Geet',
'text': "I've been there. Fixing a bug directly in staging and then "
"forgetting to update the development environment. It's a recipe for "
'disaster.'},
{'speaker': 'Shresth',
'text': "And another tricky one I've encountered is with stored procedures "
'in Data Factory pipelines. When you change the warehouse '
"connection, it doesn't always persist after deployment. Super "
'annoying.'},
{'speaker': 'Arjun',
'text': 'So, you end up having to manually update the connection in the '
'stored procedure activity every time?'},
{'speaker': 'Shresth',
'text': "Exactly. It's a tedious workaround, and I'm hoping for a better "
'solution soon. But for now, these are the realities of working '
"with Fabric deployments. It has its quirks, but it's a powerful "
'platform nonetheless.'},
{'speaker': 'Geet',
'text': 'Definitely powerful. And these kinds of discussions are crucial '
'for navigating the complexities and making the most of it.'},
{'speaker': 'Arjun',
'text': 'Totally agree. Sharing our experiences, workarounds, and '
'frustrations helps us all learn and improve. Thanks for bringing '
'these issues to light, Shresth.'},
{'speaker': 'Shresth',
'text': 'My pleasure. Hopefully, these insights will save some of you from '
'pulling your hair out during your next Fabric deployment. And be '
'sure to check out my articles for more details and tips. Until next '
'time, happy deploying!'}]
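The conversation above is what Gemini returns once the blog text and retrieved context are in the prompt: a JSON list of `{'speaker', 'text'}` turns. LLMs often wrap such output in markdown fences, so a small parsing-and-validation helper is worth having. This is a sketch under the assumption that the model is instructed to reply with exactly that JSON shape and these three speaker names:

```python
import json

SPEAKERS = {"Shresth", "Arjun", "Geet"}  # the three podcast voices used here

def parse_conversation(raw: str) -> list[dict]:
    """Parse the LLM's JSON reply into [{'speaker': ..., 'text': ...}, ...],
    stripping optional ```json fences and rejecting unknown speakers."""
    cleaned = raw.strip().removeprefix("```json").removesuffix("```").strip()
    turns = json.loads(cleaned)
    for turn in turns:
        if turn["speaker"] not in SPEAKERS:
            raise ValueError(f"Unknown speaker: {turn['speaker']}")
    return turns
```

Validating speakers here means a hallucinated fourth voice fails fast, before any TTS credits are spent on it.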
[60]
Conversation saved to ./conversation/conversation.json
[56]
['audio-files/0_Shresth.mp3', 'audio-files/1_Arjun.mp3', 'audio-files/2_Geet.mp3', 'audio-files/3_Shresth.mp3', 'audio-files/4_Arjun.mp3', 'audio-files/5_Shresth.mp3', 'audio-files/6_Geet.mp3', 'audio-files/7_Arjun.mp3', 'audio-files/8_Shresth.mp3', 'audio-files/9_Geet.mp3', 'audio-files/10_Shresth.mp3', 'audio-files/11_Arjun.mp3', 'audio-files/12_Shresth.mp3', 'audio-files/13_Geet.mp3', 'audio-files/14_Shresth.mp3', 'audio-files/15_Arjun.mp3', 'audio-files/16_Geet.mp3', 'audio-files/17_Shresth.mp3', 'audio-files/18_Arjun.mp3', 'audio-files/19_Shresth.mp3', 'audio-files/20_Geet.mp3', 'audio-files/21_Arjun.mp3', 'audio-files/22_Shresth.mp3']
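Each conversation turn is synthesized to its own file, named by turn index and speaker as listed above. A sketch of the planning half of that loop, with the actual ElevenLabs call left out (the helper names are mine; only the `{index}_{speaker}.mp3` naming comes from the output above):

```python
def audio_filename(index: int, speaker: str, out_dir: str = "audio-files") -> str:
    """Deterministic per-turn output path, e.g. 'audio-files/0_Shresth.mp3'."""
    return f"{out_dir}/{index}_{speaker}.mp3"

def plan_tts_jobs(turns: list[dict]) -> list[tuple[str, str]]:
    """Pair each turn's text with its target file. The ElevenLabs TTS call
    (one request per turn, with a per-speaker voice id) would consume these pairs."""
    return [(turn["text"], audio_filename(i, turn["speaker"]))
            for i, turn in enumerate(turns)]
```

Keeping the turn index in the filename is what lets the merge step later sort the clips back into conversation order.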
[57]
'podcast.mp3'
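The final step stitches the per-turn clips into one `podcast.mp3`. A quick-and-dirty sketch: MP3 frames can simply be concatenated byte-for-byte and most players handle the result; for gapless, properly re-encoded output you would use something like pydub's `AudioSegment` instead (the function name here is illustrative):

```python
from pathlib import Path

def merge_mp3(files: list[str], out_path: str = "podcast.mp3") -> str:
    """Concatenate MP3 files in the given order into a single output file.
    Raw byte concatenation works for same-codec clips; use pydub for clean output."""
    with open(out_path, "wb") as out:
        for f in files:
            out.write(Path(f).read_bytes())
    return out_path
```

Pass the audio-file list in turn order (as produced above) and the result is the finished episode.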
And that's it! Listen to the generated podcast, and stay tuned for the next one.