Never Lose Your Progress: Setting Up a Pro-Grade Jupyter Workflow in VS Code (Goodbye, Colab Timeouts!)

Dev.to / 2026/4/21

💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage · Models & Research

Key Takeaways

  • The article argues that Google Colab's session timeouts and lack of persistence are a major pain point for serious AI work, and recommends a local-first setup instead.
  • Installing the Jupyter extension turns VS Code into a full-featured notebook environment with better code completion and easier control over your local environment.
  • It walks through creating a dedicated Python virtual environment with venv, installing the required packages, and registering the environment as a Jupyter kernel via ipykernel.
  • It also covers launching a notebook in VS Code, selecting the kernel (interpreter), and what to do if the interpreter doesn't show up.
  • For long training runs, it stresses using TensorFlow's ModelCheckpoint callback to save weights to disk so training progress survives a restart.

The Problem: The "Colab Heartbreak"

We’ve all been there: You’re 40 epochs into a multimodal model for crop disease detection, and suddenly—Session Terminated. Your variables are gone, your local drive isn't synced, and you have to start over.
While Google Colab is great for quick scripts, serious AI projects (like my current multimodal yield prediction system) need persistence and the power of a local IDE.

Why VS Code for Jupyter?

  1. IntelliSense: Better code completion than any browser-based notebook.
  2. Local Environment Control: No more !pip install every time you open the file.
  3. Git Integration: Version control your experiments easily.
  4. The Best of Both Worlds: Use your local UI while connecting to powerful cloud GPUs (like Paperspace or Saturn Cloud); a quick connection sketch follows this list.
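
On that last point, one common pattern is to run a Jupyter server on the cloud GPU machine and point VS Code at it. The host name, port, and menu labels below are illustrative and may vary with your provider and extension version:

# On the remote GPU machine: start a Jupyter server without opening a browser
jupyter notebook --no-browser --port=8888

# On your laptop: tunnel the port over SSH so VS Code can reach the server
ssh -N -L 8888:localhost:8888 user@remote-gpu-host

Then, in VS Code's kernel picker (Select Kernel), choose the option to connect to an existing Jupyter server and paste the URL (including the token) printed by the remote server.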

Step 1: The Essentials

First, grab the Jupyter Extension from the VS Code Marketplace. This transforms VS Code into a full-featured notebook editor.
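
If you prefer the terminal, the same extension can also be installed with the code CLI (assuming the code command is on your PATH); ms-toolsai.jupyter is the Microsoft-published extension ID:

code --install-extension ms-toolsai.jupyter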

Step 2: Virtual Environment (The Secret to Stability)

Don't install your ML libraries globally! Create a dedicated environment for your project:

# Create the env
python -m venv crop_ai_env

# Activate it
source crop_ai_env/bin/activate    # Mac/Linux
# .\crop_ai_env\Scripts\activate   # Windows

# Install the kernel connector and the project libraries
# (note: on recent TensorFlow 2.x releases the plain "tensorflow" package
#  already includes GPU support, and "tensorflow-gpu" is deprecated)
pip install ipykernel tensorflow-gpu pillow pandas
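
VS Code will usually detect the activated environment on its own, but if the kernel doesn't show up you can register it explicitly with ipykernel (the name and display name below simply mirror the folder created above):

# Register the venv as a named Jupyter kernel
python -m ipykernel install --user --name crop_ai_env --display-name "Python (crop_ai_env)"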

Step 3: Launching the Notebook

  1. Create a file named experiment.ipynb.
  2. Look at the top right corner of the editor. Click Select Kernel.
  3. Choose your crop_ai_env.

Tip: If you don't see it, press Ctrl+Shift+P and run "Python: Select Interpreter" first.
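
A quick way to confirm the notebook is really running inside the new environment is a throwaway cell like this (the exact path will differ on your machine):

import sys
print(sys.executable)  # should point inside crop_ai_env, e.g. .../crop_ai_env/bin/python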

Step 4: Pro-Tip for Long Training Runs

If you are doing heavy transfer learning (like MobileNetV3), use a Checkpoint Callback. This ensures that even if your computer restarts, your model weights are safe on your drive:

import tensorflow as tf

checkpoint_path = "checkpoints/crop_model_v1.ckpt"

# Save the weights to disk as training progresses
cp_callback = tf.keras.callbacks.ModelCheckpoint(
    filepath=checkpoint_path,
    save_weights_only=True,
    verbose=1
)

# Your model.fit() now has a safety net
model.fit(train_data, epochs=50, callbacks=[cp_callback])
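
If the machine does go down mid-run, you don't have to retrain from scratch: rebuild and compile the same model architecture, then load the saved weights and continue (the epoch count below is just an example):

# Restore the last saved weights into the freshly built model
model.load_weights(checkpoint_path)

# Resume training (or skip straight to evaluation/export)
model.fit(train_data, epochs=10, callbacks=[cp_callback])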

Conclusion

Switching from Colab to VS Code + a local virtual environment (synced via GitHub or a Cloud Drive) has saved me hours of re-training time. If you're building lightweight TFLite models for offline use, this local-first workflow is a game-changer.
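
As a rough sketch of that last step (the output filename here is just an example), converting the trained Keras model for offline use looks like this:

# Convert the trained Keras model to a lightweight TFLite model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("crop_model_v1.tflite", "wb") as f:
    f.write(tflite_model)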

What’s your biggest frustration with browser-based notebooks? Let’s discuss in the comments!