cship2 2 days ago

Why is LangChain still used? I thought it wasn't needed and could just be replaced with some kind of custom ETL functions. Or is it still the de facto way to interact with LLMs?

Lerc 2 days ago

I have had a quick play around with Deno notebooks. The pain points for me were in image and UI outputs. There is such a large base of support on the Python side that it will take quite a lot of work to catch up. The various Python bits seem to be quite dependent on the implementation platform, though. Colab, JupyterLab, and integration into an editor are quite different beasts.

Being able to display an entity that carried a MessagePort and a closure for communication would enable a lot there.
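
To make that concrete, a purely hypothetical shape for such an output value (nothing like this exists in Deno.jupyter today) might be:

  // Purely hypothetical sketch: an output value carrying an initial rendering,
  // a live channel back to the front-end, and a closure to handle its events.
  interface InteractiveOutput {
    html: string;                          // initial markup shown in the cell output
    port: MessagePort;                     // channel the front-end can post messages to
    onMessage: (ev: MessageEvent) => void; // kernel-side closure invoked per message
  }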

  • jshen 2 days ago

    Can you give an example of UI and image outputs that don't work with Deno? I'm truly curious.

    • paulgb a day ago

      In the article, there’s an example where a graph object needs to be:

      - converted to png

      - converted into a promise of an array buffer

      - awaited

      - converted into a Uint8Array

      - passed to Deno.jupyter.image

      Just to display it in the notebook. In Python Jupyter, you can just return many types and the library/Jupyter will figure out how to render them.
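
      Roughly, that boilerplate looks something like this sketch; the png() method here is a stand-in for whatever the charting library actually exposes, and only Deno.jupyter.image is taken from the steps above:

        async function show(chart: { png(): Promise<ArrayBuffer> }) {
          const buf = await chart.png();     // promise of an ArrayBuffer, awaited
          const bytes = new Uint8Array(buf); // wrapped as a Uint8Array
          return Deno.jupyter.image(bytes);  // returned from the cell so the kernel renders it
        }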

      A few lines of code isn’t much but in my experience for interactive coding to be productive you have to remove every bit of friction and boilerplate you can.

AbuAssar 2 days ago

after trying the instructions I found the following issues:

1- DeepSeek-R1's <think></think> tags in the answer break the JSON validation.

2- Deno's LSP keeps marking any variable in the cell as not defined even though it is defined in a prior cell, which is annoying.
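
A possible workaround for the first issue (my own sketch, not something from the article) is to strip the reasoning block before parsing:

  // Hypothetical workaround: remove DeepSeek-R1's <think>...</think> block
  // before handing the rest of the answer to JSON.parse.
  const raw = `<think>chain of thought goes here</think>{"answer": 42}`;
  const cleaned = raw.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
  const parsed = JSON.parse(cleaned);
  console.log(parsed.answer); // 42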

AbuAssar 2 days ago

missing from the article is that you need to execute the following command to install the Deno Jupyter kernel:

deno jupyter --install

wilde a day ago

“We advise on technology”

proceeds to overengineer a hello world example while providing no demonstration of advantage over curl

linwangg 2 days ago

I like the idea of Jupyter supporting more languages, but isn’t the main issue that most AI tooling is deeply tied to Python? How do we solve that?

  • drillsteps5 6 hours ago

    By not building tooling in Python.

    llamacpp is a highly performant open source solution capable of running inference for a large number of published LLMs, on both CPU and GPU. It's written in C++.

    It's easy to download and build on Windows or Linux.

    It can be used as a command-line tool, linked as a library from a variety of languages (including Python), or communicated with through a simple REST service that is also part of the same repo. It even has a simple web frontend (built with React, I believe) that lets you use it for simple conversations, no bells and whistles.
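
    For example, a minimal sketch of talking to that REST service from Deno (assuming llama-server is already running locally on port 8080 with a model loaded):

      const res = await fetch("http://localhost:8080/completion", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ prompt: "Hello,", n_predict: 64 }),
      });
      const { content } = await res.json(); // the generated continuation
      console.log(content);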

    And yet the author is using Ollama, which is itself a wrapper around llamacpp (as most of them are), written in Python.

    We're creating the problems that need solving.

  • lairv a day ago

    Don't fix something that's not broken

nsonha 21 hours ago

tldr: article that has some code and uses tools mentioned in the title.