
Cannot read properties of undefined (reading 'split') #164

Open
lennessyy opened this issue Mar 20, 2023 · 38 comments

Comments

@lennessyy

I followed the exact instructions to install the models. I am able to get the Web UI up and running, but when I try to submit a prompt, I get this error:

/Users/lennessy/.nvm/versions/node/v18.0.0/lib/node_modules/dalai/index.js:158
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/Users/lennessy/.nvm/versions/node/v18.0.0/lib/node_modules/dalai/index.js:158:35)
    at Socket.<anonymous> (/Users/lennessy/.nvm/versions/node/v18.0.0/lib/node_modules/dalai/index.js:413:20)
    at Socket.emit (node:events:527:28)
    at Socket.emitUntyped (/Users/lennessy/.nvm/versions/node/v18.0.0/lib/node_modules/dalai/node_modules/socket.io/dist/typed-events.js:69:22)
    at /Users/lennessy/.nvm/versions/node/v18.0.0/lib/node_modules/dalai/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
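For context, the failing server-side pattern can be sketched as follows. The web UI sends the prompt request with a `models` array; when no singular `model` field is present, `req.model` is `undefined` and calling `.split(".")` on it throws exactly this TypeError. This is a sketch only; `parseModel` is an illustrative name, not dalai's actual function:

```javascript
// Hypothetical sketch of the failing pattern in dalai's query() handler:
// the web UI can send { models: [], prompt: "..." } with no singular
// `model` property, so req.model is undefined and .split(".") throws.
function parseModel(req) {
  // Defensive version: fall back to the first entry of `models`, and
  // fail with a clear message instead of a TypeError.
  const name = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
  if (typeof name !== "string" || !name.includes(".")) {
    throw new Error("No valid model specified: " + JSON.stringify(name));
  }
  const [core, model] = name.split(".");
  return { core, model: model.toUpperCase() };
}
```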
@rupakhetibinit

rupakhetibinit commented Mar 21, 2023

You probably don't have the models installed. What does it print when you just start the web UI? It should show the installed model path and folders:

> query: { method: 'installed' }
modelsPath C:\Users\user\dalai\alpaca\models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath C:\Users\user\dalai\llama\models
{ modelFolders: [] }

@jorgepvenegas

Having the same error here. Will take a look at that and share any findings @rupakhetibinit

@mpnsk

mpnsk commented Mar 22, 2023

Maybe it's obvious, but it took me a bit:

If you installed the model to a custom path with --home, as described at https://github.com/cocktailpeanut/dalai#1-installing-models-to-a-custom-path, you also have to serve it from that custom path.

@udit

udit commented Mar 23, 2023

I have the same issue. I installed the model to a custom path and I'm serving it from the same path too. The logs clearly show it has detected the models, but I'm still getting the same error:

npx dalai serve --home D:\Programs\dalai
mkdir D:\Programs\dalai
Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath D:\Programs\dalai\alpaca\models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath D:\Programs\dalai\llama\models
{ modelFolders: [] }
> query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.8,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'alpaca.7B' ],
  prompt: 'test',
  id: 'TS-1679548930678-30690'
}
D:\Programs\Scoop\persist\nodejs\cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:219
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (D:\Programs\Scoop\persist\nodejs\cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:219:35)
    at Socket.<anonymous> (D:\Programs\Scoop\persist\nodejs\cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:534:20)
    at Socket.emit (node:events:512:28)
    at Socket.emitUntyped (D:\Programs\Scoop\persist\nodejs\cache\_npx\3c737cbb02d79cc9\node_modules\socket.io\dist\typed-events.js:69:22)
    at D:\Programs\Scoop\persist\nodejs\cache\_npx\3c737cbb02d79cc9\node_modules\socket.io\dist\socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v19.8.1

@JostBrand

Same problem without using a custom path. The model is in the mentioned folder.

mkdir /home/jost/dalai
Server running on http://localhost:3000/
> query: { method: 'installed' }
modelsPath /home/jost/dalai/alpaca/models
{ modelFolders: [] }
modelsPath /home/jost/dalai/llama/models
{ modelFolders: [ '7B' ] }
> query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.1,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [],
  prompt: 'hey how are you?',
  id: 'TS-1679556212939-12038'
}
/home/jost/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:159
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/home/jost/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:159:35)
    at Socket.<anonymous> (/home/jost/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:413:20)
    at Socket.emit (node:events:512:28)
    at Socket.emitUntyped (/home/jost/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
    at /home/jost/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

@rupakhetibinit

Which version are you using? What does npx dalai --version print? Update to the latest version and try again. To update, clear the cache using npx clear-npx-cache and npm cache clean --force.
Then reinstall using npx dalai alpaca install 7B and npx dalai serve.

@udit

udit commented Mar 23, 2023

I was already on 0.3.1, but I cleared the cache and reinstalled as you said, and the issue still persists.

@rupakhetibinit

Check one thing for me: does the directory dalai/alpaca/build/Release contain the three files gglm.lib, main.exe, and quantize.exe? Also, use Node LTS version 18.15.0.

@kareivis7n

kareivis7n commented Mar 23, 2023

Well, I have 0.3.1 + 18.15.0 and my folder structure looks exactly like this on Windows 10 (I don't see any "build" folder):

https://raw.githubusercontent.com/cocktailpeanut/dalai/main/docs/alpaca_7b.png

@rupakhetibinit

I'll try and get back to you after I do a clean install from scratch.

@rupakhetibinit

Did everyone on Windows follow the steps for the Visual Studio installation? This needs Visual Studio and the necessary components installed if you're on Windows. Also, run this in Command Prompt instead of PowerShell.

@kareivis7n

Yes, I did all the steps. Actually, you get a different error if they aren't installed.

@rupakhetibinit

I deleted Visual Studio and everything else I had, then ran a totally clean install on my laptop again, and it works fine for me. I have no idea what causes this error.

@udit

udit commented Mar 23, 2023

Does the directory dalai/alpaca/build/Release contain 3 files gglm.lib, main.exe and quantize.exe?

Yes, all three are present

Also use node lts version 18.15.0

Will try

This needs visual studio and necessary components installed if you're on windows.

Yes, I hadn't installed it initially and it was throwing a different error. However, I have only installed the Desktop development with C++ workload from VS. I use Scoop as my package manager on Windows, so I have Python and Node.js installed from there.

@mooggsentry

I had the same issue and I was using Firefox. I switched to Internet Explorer and it worked after that.

@udit

udit commented Mar 23, 2023

I had the same issue and I was using Firefox. I switched to Internet Explorer and it worked after that.

Wow, that actually solved the issue. I was using Firefox too.

@kareivis7n

Actually yeah, if I open it in Edge or a Firefox private window, it works. But now the responses are weird, like "$&#65533;&#65533;&#65533;&#65533;&#65533;&#65533;&#65533;&#65533;! ⁇ &#65533;&#65533;&#65533;&#65533;#". Does yours show normal text?

@dennis-gonzales

Also able to replicate the same issue.

@skrellum

Had the same issue. I changed line 219 of index.js from:

let [Core, Model] = req.model.split(".")

to:

let [Core, Model] = req.models[0].split(".")

This small hack works for me on both Node.js 19.8.1 and 18.15.0.
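A slightly more defensive variant of the same hack (a sketch only; exact line numbers differ between dalai versions, and pickModelName is an illustrative name) would handle both request shapes and fail with a readable error when the UI sends an empty models array:

```javascript
// Hypothetical safer variant of the one-line hack above: prefer
// req.model when the client sends it, otherwise fall back to
// req.models[0], and fail loudly if neither is set.
function pickModelName(req) {
  const name = req.model || (Array.isArray(req.models) && req.models[0]) || null;
  if (!name) throw new Error("No model selected in the web UI");
  return name.split("."); // [Core, Model]
}
```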

@udit

udit commented Mar 23, 2023

Does yours show normal text?

Normal text, yes; intelligent text, not quite.

@osama-qasim

Had the same issue. I changed the index.js line 219 from:

let [Core, Model] = req.model.split(".")

to:

let [Core, Model] = req.models[0].split(".")

Works for me after this small hack, both NodeJS 19.8.1 and 18.15.0

I fixed it by adding the following line of code after line 303 in the file bin/web/views/index.ejs:

config.model = document.querySelector('#model').value;
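For anyone curious what that one-liner accomplishes, here is a minimal sketch (buildQuery is an illustrative name, not dalai's actual code): before the query is sent, the dropdown's selected value is copied into config.model, so the server-side req.model.split(".") has something to split.

```javascript
// Hypothetical sketch of the client-side fix above. In the real
// index.ejs, selectedModel comes from document.querySelector('#model').value.
function buildQuery(config, selectedModel) {
  return { ...config, model: selectedModel, models: [selectedModel] };
}
```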

@Kapowpenguin1

Also having this problem. I find it a bit strange that "models" doesn't show anything, even when "7B" appears in the command prompt.
(screenshot)

@Chesterfild

Chesterfild commented Mar 25, 2023

Also having this problem. I find it a bit strange that "models" doesn't show anything, even when "7B" appears in the command prompt.

Hi!
I fixed it by simply following the steps from this message:

Which version are you using? What does npx dalai --version print? Update to the latest version and try again. To update, clear the cache using npx clear-npx-cache and npm cache clean --force.
Then reinstall using npx dalai alpaca install 7B and npx dalai serve.

After this, I get another error:

C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:220
    Model = Model.toUpperCase()
                  ^

TypeError: Cannot read properties of undefined (reading 'toUpperCase')
    at Dalai.query (C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:220:19)
    at Socket.<anonymous> (C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\dalai\index.js:534:20)
    at Socket.emit (node:events:513:28)
    at Socket.emitUntyped (C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\socket.io\dist\typed-events.js:69:22)
    at C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\socket.io\dist\socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v18.15.0

At this point I'm stuck again. =\

@Tameflame

Tameflame commented Mar 25, 2023

Having this same issue on WSL, accessing the server through Chrome on Windows. Does anyone know how I can browse to the index.js file in bash? cd home/USER/.npm doesn't work.

Edit: accessing it in Edge works fine.

@Chesterfild

Chesterfild commented Mar 25, 2023

Having this same issue on WSL, accessing the server through Chrome on Windows. Does anyone know how I can browse to the index.js file in bash? cd home/USER/.npm doesn't work.

Edit: accessing it in Edge works fine.

Strange, all my attempts have failed with one error or another. Switching browsers doesn't seem to work for some reason, and any attempt to fix the error results in more ridiculous errors.

I guess I'll have to wait until it's fixed.

If you're interested, here's one of my attempts to fix the code, and the error messages:

async query(req, cb) {
    if (!req.model) {
        console.log("No model specified")
        return
    }
    let [Core, Model] = req.model.split(".")
    if (!Model) {
        console.log("Invalid model specified")
        return
    }
    Model = Model.toUpperCase()

    console.log(`> query:`, req)
    if (req.method === "installed") {
        let models = await this.installed()
        for (let model of models) {
            cb(model)
        }
        cb('\n\n<end>')
        return
    }

    if (req.prompt && req.prompt.startsWith("/")) {
        try {
            const mod = require(`./cmds/${req.prompt.slice(1)}`)
            if (mod) {
                mod(this)
                return
            }
        } catch (e) {
            console.log("require log", e)
        }
    }

    if (!req.prompt) {
        return
    }

    console.log({ Core, Model })
    const cls = this.loadModel(Core, Model)
    if (!cls) {
        console.log(`Model not found: ${req.model}`)
        return
    }

    const model = new cls(this)
    if (model) {
        model[req.method](req.args, cb)
    }
}

Result is:
SyntaxError: Unexpected token '['

    at internalCompileFunction (node:internal/vm:73:18)
    at wrapSafe (node:internal/modules/cjs/loader:1176:20)
    at Module._compile (node:internal/modules/cjs/loader:1218:27)
    at Module._extensions..js (node:internal/modules/cjs/loader:1308:10)
    at Module.load (node:internal/modules/cjs/loader:1117:32)
    at Module._load (node:internal/modules/cjs/loader:958:12)
    at Module.require (node:internal/modules/cjs/loader:1141:19)
    at require (node:internal/modules/cjs/helpers:110:18)
    at Object.<anonymous> (C:\Users\Otto\AppData\Local\npm-cache\_npx\3c737cbb02d79cc9\node_modules\dalai\bin\cli.js:5:15)
    at Module._compile (node:internal/modules/cjs/loader:1254:14)

@mehdichekori

I switched to Firefox and I am no longer getting this issue.

I was using chrome previously.

@Kapowpenguin1

Kapowpenguin1 commented Mar 26, 2023

I fixed it by simply following the steps from this message:

I did this multiple times, to no avail. Still getting TypeError: Cannot read properties of undefined (reading 'split').

@dtrevithick

Switching to incognito in the Edge browser seems to work. Will have to try other browsers as well.

@Kapowpenguin1

Switching to incognito in the Edge browser seems to work. Will have to try other browsers as well.

How curious... only the Alpaca 7B model seems to work properly, and only if the browser wasn't previously tainted by Llama 7B. Running Alpaca 7B in an incognito/fresh browser session fixes the error.

@Kapowpenguin1

Switching to incognito in the Edge browser seems to work. Will have to try other browsers as well.

How curious... only the Alpaca 7B model seems to work properly, and only if the browser wasn't previously tainted by Llama 7B. Running Alpaca 7B in an incognito/fresh browser session fixes the error.

So it seems like the problem with Llama 7B was that it wasn't converted into ggml format. I dug into the program files, found a directory for llama.cpp, and followed its usage directions after rerunning npx dalai llama install 7B, because apparently my previous Llama 7B directory was messed up. I then moved llama from the dalai directory into WSL, went through the usage directions for llama.cpp, and moved llama back into the original dalai directory. Now Llama 7B shows up for me.

@Kapowpenguin1

Hm... even being able to load Llama 7B, I can't get any prompts to process... but that might be a completely different issue.

@Chesterfild

Switching to incognito in the Edge browser seems to work. Will have to try other browsers as well.

Thank you!
It's working now, but only when I run it in incognito mode.

@jojojoy99

I switched to Firefox and I am no longer getting this issue.

I was using chrome previously.

I switched to Firefox and it solved the problem.

@Shoresh613

Switching to Chrome (from Edge) in incognito mode works for me, but only the Alpaca 7B model shows up in the dropdown, though it shows:

Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath /home/mikael/dalai/alpaca/models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath /home/mikael/dalai/llama/models
{ modelFolders: [ '7B' ] }

I wonder why the Llama model doesn't show up even though it's listed and found, no? Or is it that it doesn't find it in the Llama folder, since it doesn't say "exists 7B" below it?

Here's the content of the folder:

~/dalai/llama/models$ ls 7B
checklist.chk  consolidated.00.pth  params.json

The .pth file is 13.5GB

@Kapowpenguin1

Kapowpenguin1 commented Mar 28, 2023

Switching to Chrome (from Edge) in incognito mode works for me, but only the Alpaca 7B model shows up in the dropdown, though it shows:

Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath /home/mikael/dalai/alpaca/models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath /home/mikael/dalai/llama/models
{ modelFolders: [ '7B' ] }

I wonder why the Llama model doesn't show up even though it's listed and found, no? Or is it that it doesn't find it in the Llama folder, since it doesn't say "exists 7B" below it?

Here's the content of the folder:

~/dalai/llama/models$ ls 7B
checklist.chk  consolidated.00.pth  params.json

The .pth file is 13.5GB

If you want to get it to show up, you have to run through the usage process in the README file under the dalai/llama directory. I got it to show up by moving the folder into WSL, going through the instructions, then moving it back. Getting it to work is a different question though, because my prompts wouldn't process no matter how long I waited. I don't know why.

@Gh05t-1337

Had the same issue and solved it. Here's how I did it:

Open the web UI, go to inspect element, then to the Storage tab and to Local Storage. There should be an entry called 'config'. It should have a key called 'models'. If it's empty, that's the reason why 'models' doesn't show anything and why you get an error. It should be an array containing your model in the format core.model; for example, it could be ['llama.7B'], so change it to your core and model.

'config' may also have a key called 'model'; change it to core.model as well. Then try running your prompt again.
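The manual edit above can also be done from the devtools console. Here is a hedged sketch (the 'config' key name and the core.model format are taken from this comment and not verified against any particular dalai version; repairConfig is an illustrative helper, not dalai code). The pure function returns the repaired JSON string to write back with localStorage.setItem:

```javascript
// Hypothetical helper for the localStorage fix described above. Pass the
// current stored value of the "config" key (a JSON string or null) and
// the "core.model" name you installed; it returns the repaired JSON.
function repairConfig(storedJson, modelName) {
  const config = storedJson ? JSON.parse(storedJson) : {};
  config.models = [modelName]; // the UI reads the selected models from here
  config.model = modelName;    // some versions also read a singular `model`
  return JSON.stringify(config);
}

// In the browser devtools console, you would then run something like:
// localStorage.setItem("config",
//   repairConfig(localStorage.getItem("config"), "alpaca.7B"));
```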

If it's still not working no matter how long you wait, check whether you have the models downloaded. As soon as you click 'Go', it executes:

path-to-dalai-llama/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

### Instruction:
>PROMPT

### Response:
"

As you can see, it (in my case) needs the file models/7B/ggml-model-q4_0.bin. It should be something similar in your case; you'll find this command in the terminal after clicking 'Go'. Check what that file is called and whether it exists in your llama folder. If not, search for it on the internet, download it, and put it in its place.

If it's still not working after making sure you have the file, it may be because your model is too old (at least that was the case for me). In that case, you'll find a solution here: ggml-org/llama.cpp#361

You can also try executing the command from your terminal instead of using the web UI, to see what errors you get.

Hope this helps!

@xhitijc2

I was getting the same issue. Switching to Firefox worked for me!

@Dannysdable

I was getting the same issue. Switching to Firefox did not work for me!
Node.js v20.3.0
