Phenek
just joined
Topic Author
Posts: 6
Joined: Tue Oct 03, 2023 11:05 am

[/file find name=$fileName] using 100% CPU usage

Tue Oct 10, 2023 1:20 pm

Hello,

The story: I am trying to handle BLE advertisements in offline mode.
So I have to save a JSON file of 4096 bytes every 1 or 2 seconds.
Then I will send all the files and remove them once the MQTT broker is reconnected.

I have an issue with the command [/file find name=$fileName]:
when there are a lot of files, it takes 100% of the CPU.

Here is a function that builds a fileName with a counter at the end, so the name is always unique.
:global getFileName do={
    :local counter 0
    :local fileName ("test/toto" . $counter . ".json")

    :while ([/file find name=$fileName] != "") do={
        :set counter ($counter + 1)
        :set fileName ("test/toto" . $counter . ".json")
    }
    :return $fileName
}

Report:
With fewer than 50 files, it uses between 60% and 100% of the CPU, but returns fast.
With about 100 files, it uses 100% CPU, but at least returns after 200 ms.
With more than 1000 files, it uses 100% CPU and the command only starts returning after 3 seconds (and I have a file to save every 1 or 2 seconds XD).

PS: My real fileName is derived from the date & time, but when I have more than 4096 bytes of data I create 2 files with the same base name, and the getFileName function then takes twice as long to return...
Example:
2023-10-09-13:02:41-8580.json
2023-10-09-13:02:41-8581.json
 
optio
Long time Member
Posts: 686
Joined: Mon Dec 26, 2022 2:57 pm

Re: [/file find name=$fileName] using 100% CPU usage

Tue Oct 10, 2023 4:15 pm

Maybe if you share the complete source of your script(s) it will be possible to see how to optimize it. If you have that many files, use a different approach: let a remote host download the files over SSH or FTP from ROS. You can use a global variable to track the filename counter for saving files; there is no need to find the last one by iterating a counter. When the router is back online you can, for example, use an HTTP request to trigger the remote host to start downloading, and then reset the counter value.
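For example, something like this (untested sketch; $bleFileCounter is just an example name for that global, and $content is whatever your save loop produces):

# sketch: track the next free index in a global instead of scanning /file
:global bleFileCounter
:if ([:typeof $bleFileCounter] = "nothing") do={ :set bleFileCounter 0 }
:local fileName ("test/toto" . $bleFileCounter . ".json")
/file add name=$fileName contents=$content
:set bleFileCounter ($bleFileCounter + 1)
# when the remote side has downloaded and removed the files, reset it:
# :set bleFileCounter 0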
 
Amm0
Forum Guru
Posts: 3557
Joined: Sun May 01, 2016 7:12 pm
Location: California

Re: [/file find name=$fileName] using 100% CPU usage

Tue Oct 10, 2023 4:27 pm

You might try adding a ":delay 0.1s" in the :while.

But the "find" operation is notoriously slow.
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Tue Oct 10, 2023 7:53 pm

Or simply create a small file containing the last index used, instead of doing all that sh.t uselessly...
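For example (untested sketch; the index file name is illustrative and $content comes from the save loop):

# sketch: remember the last used index in a tiny file instead of scanning /file
:local idxFile "test/last-index.txt"
:local counter 0
:do {
    :set counter ([:tonum [/file get $idxFile contents]] + 1)
} on-error={
    # first run: the index file does not exist yet
    /file add name=$idxFile contents="0"
}
/file add name=("test/toto" . $counter . ".json") contents=$content
/file set $idxFile contents=[:tostr $counter]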
 
Phenek
just joined
Topic Author
Posts: 6
Joined: Tue Oct 03, 2023 11:05 am

Re: [/file find name=$fileName] using 100% CPU usage

Wed Oct 18, 2023 4:20 pm

@optio HTTP/SSH doesn't work here. I am saving the files while the router is offline.

@Amm0 A ":delay 1s" is already set up in my infinite loop that saves BLE advertisements every second.
But once 1000 files are already saved, [/file find name=$fileName] only returns after 3 seconds, and then I delay 1 s on top of that.
However, the iteration duration increases drastically... I need an iteration every second, even with more than 1000 files.

@rextended For me, it's kind of normal to check if a filename already exists before creating a file.
Because the command file/add name=$filename contents=$content can throw the exception "File already exists" and end the process.
So, your workaround might work initially, but if for any reason the file already exists (unplug, hard reboot, reset file last index, your file isn't updated with the last index, etc.), then we could face a "File already exists" error and the process would end.

I can give you an example where [/file find name=$fileName] uses 100% of the CPU.
But please take into consideration that in my real case the loop is more elaborate, and the files I save usually do not already exist.
To see the real issue, please create 1000 files inside the router first:
################################## Functions ##################################
:global getFileName do={
    :local prefix "test/toto"
    :local counter $1
    :local name ($prefix . $counter . ".json")
    :put ($name)
    :return $name
}

:global EpochTime do={
   :local ds [/system clock get date];
   :local months;
   :if ((([:pick $ds 9 11]-1)/4) != (([:pick $ds 9 11])/4)) do={
      :set months {"an"=0;"eb"=31;"ar"=60;"pr"=91;"ay"=121;"un"=152;"ul"=182;"ug"=213;"ep"=244;"ct"=274;"ov"=305;"ec"=335};
   } else={
      :set months {"an"=0;"eb"=31;"ar"=59;"pr"=90;"ay"=120;"un"=151;"ul"=181;"ug"=212;"ep"=243;"ct"=273;"ov"=304;"dec"=334};
   }
   :set ds (([:pick $ds 9 11]*365)+(([:pick $ds 9 11]-1)/4)+($months->[:pick $ds 1 3])+[:pick $ds 4 6]);
   :local ts [/system clock get time];
   :set ts (([:pick $ts 0 2]*60*60)+([:pick $ts 3 5]*60)+[:pick $ts 6 8]);
   :return ($ds*24*60*60 + $ts + 946684800 - [/system clock get gmt-offset]);
}


############################### Bluetooth Loop ################################
:put ("[*] Gathering Bluetooth info...")
:local counter 0
    :local content "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed sed odio enim. Suspendisse pellentesque urna id sem lobortis, id vestibulum magna faucibus. Aenean eleifend risus sed est commodo, dignissim feugiat mauris mattis. Mauris auctor finibus ipsum eget eleifend. Aenean eget est nec nisl suscipit sagittis at nec purus. In quis nisi enim. Etiam sed arcu fringilla est dignissim porta non non enim. In interdum ultricies tortor vitae efficitur. Pellentesque accumsan bibendum suscipit. Vivamus nec aliquam lacus. Suspendisse vitae risus rhoncus, dignissim mauris non, cursus eros.

Etiam a ipsum purus. Mauris justo massa, hendrerit ut ipsum quis, eleifend cursus urna. Mauris facilisis turpis urna, at mattis velit sollicitudin at. Vivamus tincidunt, nulla non tincidunt condimentum, massa orci bibendum tellus, at convallis nisi sem ut arcu. Nullam eu imperdiet odio. Suspendisse dignissim mollis nibh, in laoreet sapien pretium a. Donec lacus nisi, ultricies ut leo eu, gravida scelerisque eros. Vestibulum sit amet ultricies justo, eu facilisis diam. Donec luctus diam ac neque consequat, nec tincidunt nisl lobortis. Praesent imperdiet consequat leo, a convallis ligula tempus ac.

Vestibulum elit lorem, vehicula id dolor ut, mollis lacinia tortor. Nullam congue magna fringilla turpis interdum, et maximus dolor hendrerit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Quisque consectetur ultrices tellus, eu feugiat sapien tempor eu. In eleifend maximus leo. Aliquam erat volutpat. Pellentesque ultricies facilisis congue. Donec et tellus tincidunt, pulvinar purus eget, imperdiet magna. In eget sem eros. Suspendisse commodo sollicitudin rutrum. Interdum et malesuada fames ac ante ipsum primis in faucibus. Suspendisse quis tortor non nulla vehicula mattis. Nulla luctus congue nisi, vel ultricies ex pulvinar in. Aenean sit amet risus pellentesque, dapibus elit quis, lobortis nisi. Nam molestie, turpis non fermentum congue, nibh erat finibus est, nec vestibulum dolor orci et magna. Aenean eu odio non diam dictum posuere eget vitae justo.

Mauris venenatis at quam eget ultricies. Pellentesque sodales neque dolor, in rutrum metus volutpat in. Aenean luctus diam id tortor hendrerit iaculis. Pellentesque mattis nulla id nisl tincidunt, in vehicula eros gravida. Duis vestibulum bibendum leo ut dictum. Duis quis sem at urna hendrerit posuere. Suspendisse id erat eu tortor placerat ullamcorper. Integer gravida justo sit amet condimentum iaculis. 

Vestibulum ornare, ante ac posuere suscipit, risus massa tempor erat, eu venenatis mi lectus ac erat. Pellentesque quis erat efficitur, rutrum purus in, sollicitudin justo. Vivamus vitae urna ante. Donec maximus arcu vitae augue vestibulum interdum. Sed congue ullamcorper tellus, ac pretium metus porta finibus. Cras ultricies ligula libero, eget dapibus risus rutrum in. Morbi maximus lobortis purus a pharetra. Nulla et luctus orci, ac euismod elit. Nunc fermentum ligula sed purus suscipit, quis consectetur enim sodales. Etiam eu gravida ipsum. Cras posuere tortor ac orci sagittis dapibus. Proin quis tristique quam.

Nunc id mi in tortor pulvinar mollis a quis lacus. Pellentesque blandit aliquam purus at vestibulum. Praesent facilisis nisl quis risus laoreet, vel varius diam suscipit. Ut ullamcorper elit eget ornare sollicitudin. Praesent in sagittis ex. Morbi eget quam hendrerit, accumsan sapien interdum, vehicula lectus. Nam nunc mi, pulvinar eu tincidunt et, eleifend sit amet nunc. Proin quis ex vitae nisi sodales commodo. Nulla facilisi. Donec sit amet risus et leo efficitur tristique non id lacus. Donec iaculis commodo lorem, eu laoreet dolor faucibus eget. Vestibulum ligula metus, malesuada quis justo in, molestie ultricies dui. Nulla ultricies tellus et enim condimentum iaculis. Sed imperdiet nibh sed orci facilisis, et vestibulum diam molestie. Sed turpis arcu, lobortis ac consectetur sed, dapibus quis massa. 
"
:while (true) do={
    :local fileName [$getFileName $counter]
    :local startTime [$EpochTime]

    :if ([/file find name=$fileName] = "") do={
        :local endTime [$EpochTime]
        # Calculating the difference between the start and end time
        :local execTime ($endTime - $startTime)
        :put ("[*] Time taken for file check: $execTime seconds")

        :put ("[*] $fileName file size: $[:len $content] bytes")
        /file add name=$fileName contents=$content
    }
    :set counter ($counter + 1)
}

test/toto1001.json
[*] Time taken for file check: 1 seconds
[...]
test/toto2001.json
[*] Time taken for file check: 3 seconds
[...]
test/toto3001.json
[*] Time taken for file check: 5 seconds
[...]
test/toto4001.json
[*] Time taken for file check: 7 seconds
[...]
test/toto5001.json
[*] Time taken for file check: 9 seconds
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Wed Oct 18, 2023 6:18 pm

I really don't like your approach at all.


Find the first available (not used) number within a file sequence:

find first free code

:global firstFree do={
    /file
    :local sFName  "$1"
    :local lsFName [:len $sFName]
    :local eFName  "$2"
    :local leFName [:len $eFName]
    :local fName   ""
    :local fArray  [:toarray ""]
    :local fIndex  0
    :local sResult [print as-value where name~"^$sFName[0-9]{1,10}$eFName\$"]
    :foreach item in=$sResult do={
        :set fName  ($item->"name")
        :set fIndex [:tonum [:pick $fName $lsFName ([:len $fName] - $leFName)]]
        :set ($fArray->$fIndex) true
    }
    :for index from=0 to=([:len $fArray] - 1) step=1 do={
        :if ($fArray->$index) do={
# debug            :put "File $sFName$index$eFName already exist."
        } else={
# debug            :put "File $sFName$index$eFName do not exist."
            :return $index
        }
    }
    :return 0
}

:put [$firstFree "test-" ".txt"]
For example, if test-0.txt, test-1.txt, test-2.txt, test-8.txt and test-10.txt exist,
3 is returned.



Find the first available (not used) number after a file sequence:

find after next busy code

:global afterNextBusy do={
    /file
    :local sFName  "$1"
    :local lsFName [:len $sFName]
    :local eFName  "$2"
    :local leFName [:len $eFName]
    :local fName   ""
    :local fArray  [:toarray ""]
    :local fIndex  0
    :local sResult [print as-value where name~"^$sFName[0-9]{1,10}$eFName\$"]
    :foreach item in=$sResult do={
        :set fName  ($item->"name")
        :set fIndex [:tonum [:pick $fName $lsFName ([:len $fName] - $leFName)]]
        :set ($fArray->$fIndex) true
    }
    :return [:len $fArray]
}

:put [$afterNextBusy "test-" ".txt"]
For example, if test-0.txt, test-1.txt, test-2.txt, test-8.txt and test-10.txt exist,
11 is returned.



Return an array with the numbers of all files present.

return all file numbers array code

:global findAllFiles do={
    /file
    :local sFName  "$1"
    :local lsFName [:len $sFName]
    :local eFName  "$2"
    :local leFName [:len $eFName]
    :local fName   ""
    :local fArray  [:toarray ""]
    :local fIndex  0
    :local sResult [print as-value where name~"^$sFName[0-9]{1,10}$eFName\$"]
    :foreach item in=$sResult do={
        :set fName  ($item->"name")
        :set fIndex [:tonum [:pick $fName $lsFName ([:len $fName] - $leFName)]]
        :set ($fArray->$fIndex) true
    }
    :return $fArray
}


{
:local fileSearch [$findAllFiles "test-" ".txt"]
    :for index from=0 to=([:len $fileSearch] - 1) step=1 do={
        :if ($fileSearch->$index) do={
            :put "File test-$index.txt already exist."
        } else={
            :put "File test-$index.txt do not exist."
        }
    }
}
To check if a file exists, just check whether its index in the array is true. For example, supposing that only test-0.txt, test-1.txt, test-2.txt, test-8.txt and test-10.txt exist:

example code

{
:local fileSearch [$findAllFiles "test-" ".txt"]
:put ($fileSearch->3)  # return true
:put ($fileSearch->11) # return nil
}
Tested on an SXT G-5HPacD r2 with 5000 files; it returns in less than 1 second. (Since it is RouterOS v6, creating the 5000 files took longer than the search...)
Last edited by rextended on Wed Oct 18, 2023 7:47 pm, edited 3 times in total.
 
optio
Long time Member
Posts: 686
Joined: Mon Dec 26, 2022 2:57 pm

Re: [/file find name=$fileName] using 100% CPU usage

Wed Oct 18, 2023 7:19 pm

@optio HTTP/SSH doesn't work here. I am saving the files while the router is offline.
I get that, but when it is back online you can trigger some event on the remote side to download the files (over SSH, for example) instead of sending them as content from ROS. And if you don't need to send them to a remote host, because you download them manually, then I don't see the reason to find a file if it is only ever written...
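For example, a single request when the link comes back could be enough (sketch; the URL and endpoint are hypothetical):

# sketch: notify a hypothetical collector so it pulls the files itself over SFTP/FTP
/tool fetch url="http://collector.example.com/ble-sync/$[/system identity get name]" keep-result=no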
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Wed Oct 18, 2023 7:36 pm

@rextended For me, it's kind of normal to check if a filename already exists before creating a file.
Thanks for explaining things to me... :lol:
RouterOS is not a programming language, so sometimes it's best to avoid using any form of [find] in large lists....

So, your workaround might work initially, but if for any reason the file already exists (unplug, hard reboot, reset file last index, your file isn't updated with the last index, etc.),
then we could face a "File already exists" error and the process would end.
I don't want to argue about this, it's useless. Also because if the file is created before the index is updated, there is certainly a logic error in the process.
Restarting the machine should not affect it. If it does, it means it was poorly designed.

Instead of writing nonsense, always save the files with the Unix epoch time and it will be very difficult for two files to have the same name.

v7 only example code

:global datetime2epoch do={
    :local dtime [:tostr $1]
    /system clock
    :local cyear [get date] ; :if ($cyear ~ "....-..-..") do={:set cyear [:pick $cyear 0 4]} else={:set cyear [:pick $cyear 7 11]}
    :if (([:len $dtime] = 10) or ([:len $dtime] = 11)) do={:set dtime "$dtime 00:00:00"}
    :if ([:len $dtime] = 15) do={:set dtime "$[:pick $dtime 0 6]/$cyear $[:pick $dtime 7 15]"}
    :if ([:len $dtime] = 14) do={:set dtime "$cyear-$[:pick $dtime 0 5] $[:pick $dtime 6 14]"}
    :if ([:len $dtime] =  8) do={:set dtime "$[get date] $dtime"}
    :if ([:tostr $1] = "") do={:set dtime ("$[get date] $[get time]")}
    :local vdate [:pick $dtime 0 [:find $dtime " " -1]]
    :local vtime [:pick $dtime ([:find $dtime " " -1] + 1) [:len $dtime]]
    :local vgmt  [get gmt-offset]; :if ($vgmt > 0x7FFFFFFF) do={:set vgmt ($vgmt - 0x100000000)}
    :if ($vgmt < 0) do={:set vgmt ($vgmt * -1)}
    :local arrm  [:toarray "0,0,31,59,90,120,151,181,212,243,273,304,334"]
    :local vdoff [:toarray "0,4,5,7,8,10"]
    :local MM    [:pick $vdate ($vdoff->2) ($vdoff->3)]
    :local M     [:tonum $MM]
    :if ($vdate ~ ".../../....") do={
        :set vdoff [:toarray "7,11,1,3,4,6"]
        :set M     ([:find "xxanebarprayunulugepctovecANEBARPRAYUNULUGEPCTOVEC" [:pick $vdate ($vdoff->2) ($vdoff->3)] -1] / 2)
        :if ($M>12) do={:set M ($M - 12)}
    }
    :local yyyy  [:pick $vdate ($vdoff->0) ($vdoff->1)] ; :if ((($yyyy - 1968) % 4) = 0) do={:set ($arrm->1) -1; :set ($arrm->2) 30}
    :local totd  ((($yyyy - 1970) * 365) + (($yyyy - 1968) / 4) + ($arrm->$M) + ([:pick $vdate ($vdoff->4) ($vdoff->5)] - 1))
    :return      (((((($totd * 24) + [:pick $vtime 0 2]) * 60) + [:pick $vtime 3 5]) * 60) + [:pick $vtime 6 8] - $vgmt)
}

{
:local fileIndex [$datetime2epoch]
/file add name="test-$fileIndex.txt" contents=$fileIndex
}
 
Phenek
just joined
Topic Author
Posts: 6
Joined: Tue Oct 03, 2023 11:05 am

Re: [/file find name=$fileName] using 100% CPU usage

Thu Oct 19, 2023 12:22 pm

@optio I am saving JSON files only in offline mode; when the router is reconnected I send the JSON to my MQTT server with the command
/iot mqtt publish broker=$brokerName topic="$brokerTopic" message=$fileContent
Here fileContent is limited to 4096 bytes, but that's just the 12-bit (4096-byte) limit...

@rextended thanks for your time.

I don't really need to convert dateTime to epoch, because the BLE advertisements already give me the epoch, like this.
And I take the last advertisement's epoch for my filename; it's less compute. (So I am fine with this.)
:local advertisements [/iot bluetooth scanners advertisements print detail as-value]
    :if ([:len $advertisements] > 0) do={
        :foreach adv in=$advertisements do={
            :local data ($adv->"data")
            :local epoch ($adv->"epoch")
My real file names look like this:
2023-10-09-13:02:41-8580.json
2023-10-09-13:02:41-8581.json

I tried some of your code with 4900 files:


#find first free code
It does not work with more than 4096 files on the disk.
[...]
File test/toto805.json already exist.
File test/toto806.json do not exist.
[*] Time taken for file check: 25 seconds

#find after next busy code
[*] Time taken for file check: 23 seconds
[*] test/toto4901.json
[*] Time taken for file check: 21 seconds
[*] test/toto4902.json

#return all file numbers array code
This one could work, but it still takes 100% CPU usage during initialization.
[*] Time taken to initialize array code: 35 seconds
[*] Time taken for file check: 1 seconds
[*] test/toto4903.json

OK, but you know,
I wonder why the command /file add name="test/toto4000.json" contents="toto"
gives me "file already exists" and ends my process in less than a second.

Conclusion: under the hood, the /file add command does a better check by itself, even with 5000 files.
I wonder where the real nonsense is here; does it come from me??
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Thu Oct 19, 2023 5:28 pm

I can help you by proposing alternative solutions, but not for debugging RouterOS. It was not born as a NAS.
Create subfolders for each year/month/day/hour/minute and you will see that the search will always be near instantaneous...
Unless you create 1000 files per second...

I repeat, your approach seems wrong to me.

But does this file 2023-10-09-13:02:41-8580.json mean that you create at least 8580 files every second?
Why do you care about all this bullshit if you can directly use the Bluetooth Unix time to sort the files and be sure of creating a unique filename...
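Something in this direction (untested sketch; it assumes a ROS version where /file add type=directory is available, otherwise the folders have to be pre-created, and $content comes from the BLE loop):

# sketch: one folder per minute keeps every folder small, so add/find stays fast
:local ts  "2023-10-09-13:02:41-8580"
:local dir ("ble/" . [:pick $ts 0 10] . "/" . [:pick $ts 11 13] . "/" . [:pick $ts 14 16])
# create the folder if it is not there yet (ignore "already exists")
:do { /file add type=directory name=$dir } on-error={ }
/file add name=($dir . "/" . $ts . ".json") contents=$content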
 
optio
Long time Member
Posts: 686
Joined: Mon Dec 26, 2022 2:57 pm

Re: [/file find name=$fileName] using 100% CPU usage

Thu Oct 19, 2023 6:36 pm

@Phenek You can use an index file of created filenames: on each save, append the filename to the index file; to load the content to send as MQTT messages, iterate over the index file and load the files using the filenames from it. If you are splitting message content into multiple segments per second, you will also need to track that segment number, using a global variable or some file. That way you don't need to find the last written file on the filesystem. If the filename index file can reach the 4096 B limit, then you will need to segment that file as well and track its segment number. It is a somewhat complex algorithm, but it will probably still run faster than simply finding files on the filesystem with that large a number of files in a directory.
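A rough sketch of that bookkeeping (names are illustrative; as said, the index file itself is also subject to the 4096 B content limit):

# sketch: append every saved filename (comma separated) to an index file
:global appendIndex do={
    :local idx "test/index.txt"
    :if ([:len [/file find name=$idx]] = 0) do={
        /file add name=$idx contents=$1
    } else={
        /file set $idx contents=([/file get $idx contents] . "," . $1)
    }
}
# on reconnect, iterate the recorded names instead of scanning /file, e.g.:
# :foreach f in=[:toarray [/file get "test/index.txt" contents]] do={ ... publish $f ... }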
Last edited by optio on Thu Oct 19, 2023 7:07 pm, edited 1 time in total.
 
Amm0
Forum Guru
Posts: 3557
Joined: Sun May 01, 2016 7:12 pm
Location: California

Re: [/file find name=$fileName] using 100% CPU usage

Thu Oct 19, 2023 7:06 pm

I can help you by proposing alternative solutions, but not for debugging RouterOS. It was not born as a NAS.
Yeah, every command (and output), or even keystroke, goes through way more processing than in a typical shell. There is just no way to "speed up" the "find"... for whatever reason it is just an intensive operation today...

A different approach would be to use exception handling instead. The create in the normal path should work fine, so the check is superfluous most of the time. So perhaps just use a :do { ... } on-error={ ... } instead, and make the on-error branch sort out why there was a duplicate file name.
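Roughly like this (sketch; the fallback suffix is only one example of "sorting it out"):

# sketch: write first, handle the rare duplicate name in the error branch
:do {
    /file add name=$fileName contents=$content
} on-error={
    # duplicate (or other) failure: retry once under a distinguishing name
    :do {
        /file add name=($fileName . ".dup") contents=$content
    } on-error={ :log warning "could not save $fileName" }
}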
 
Phenek
just joined
Topic Author
Posts: 6
Joined: Tue Oct 03, 2023 11:05 am

Re: [/file find name=$fileName] using 100% CPU usage

Fri Oct 27, 2023 12:17 pm

Thx @Amm0,
Unfortunately, the command below is interrupted and doesn't process the on-error scope:
:do { /file add name=$fileName contents=$content } on-error={:put "skip"}
[*] test/toto0.json file size: 3920 bytes
interrupted
file already exists

@optio yeah, we tried this solution, and it is better, but after a while, the command /file add name=$fileName contents=$content starts to take some time.
And in the end, it uses 100% CPU usage.

@rextended
- 2023-10-09-13:02:41-8580.json is an ISO 8601-style date file name; you can see the milliseconds at the end of the file names.
I get that from the epoch of the last BLE advertisement that closes the file, at around 4096 bytes.
But sometimes I get "file already exists"... and the process is interrupted.
- Creating subfolders for each year/month/day/hour/minute did not help. With too many files, RouterOS melts down.

Our team has analyzed the limitations of RouterOS. MQTT messages are limited to 4096 bytes, and files are limited in number and in size (inside the MikroTik environment).
Checking whether a file exists and reading/writing large files should be common operations in any scripting language. I know you will tell me it is possible.
But we are not here to re-invent the wheel (and unit-test the wheel...); if it is possible, then it should be native.

100% CPU usage is dangerous for us because we lose some BLE advertisements in the air.

So, thank you for your time. We will not proceed further this year with RouterOS.
Every year we have about a thousand routers to replace for maintenance,
so maybe we will look again at MikroTik and its limitations (I will).

Best wishes,
And thank you again :)
 
optio
Long time Member
Posts: 686
Joined: Mon Dec 26, 2022 2:57 pm

Re: [/file find name=$fileName] using 100% CPU usage

Fri Oct 27, 2023 3:19 pm

And in the end, it uses 100% CPU usage.
Then you can try to segment file writes into separate directories with directory index names 0, 1, 2, 3... Limit the number of files per directory; when the limit is reached, create a new directory and write into it. The last index and its file count also need to be tracked, so that you don't have to count the files of the last directory index when performing a new file write.
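A rough sketch of that bookkeeping (the globals and the per-directory limit are illustrative; $baseName and $content come from the BLE loop):

# sketch: cap files per numbered directory, roll over when the cap is reached
:global dirIndex
:global dirCount
:if ([:typeof $dirIndex] = "nothing") do={ :set dirIndex 0 }
:if ([:typeof $dirCount] = "nothing") do={ :set dirCount 0 }
:local maxPerDir 100
:if ($dirCount >= $maxPerDir) do={
    :set dirIndex ($dirIndex + 1)
    :set dirCount 0
}
# assumes the directory test/$dirIndex exists or is created elsewhere
/file add name=("test/" . $dirIndex . "/" . $baseName) contents=$content
:set dirCount ($dirCount + 1)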
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Sat Oct 28, 2023 1:05 pm

@rextended
- 2023-10-09-13:02:41-8580.json is an ISO 8601-style date file name; you can see the milliseconds at the end of the file names.
False.
2023-10-09-13:02:41-8580 = 2023-10-09 13:02:41 with negative TimeZone "-85:80" (3 days, 14 hours and 20 minutes)
The correct format for milliseconds is
"2023-10-09T13:02:41.858000Z"
If you read the string literally, it is a misrepresentation of the correct format...
(The T can be replaced with a space, not with a "-", and the Z [or the time zone] can be omitted.)
 
rextended
Forum Guru
Posts: 12025
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: [/file find name=$fileName] using 100% CPU usage

Sat Oct 28, 2023 1:23 pm

Then you can try to segment file writes into separate directories with directory index names 0, 1, 2, 3...

Create subfolders for each year/month/day/hour/minute and you will see that the search will always be near instantaneous...
 
optio
Long time Member
Posts: 686
Joined: Mon Dec 26, 2022 2:57 pm

Re: [/file find name=$fileName] using 100% CPU usage

Sat Oct 28, 2023 1:30 pm

Create subfolders for each year/month/day/hour/minute and you will see that the search will always be near instantaneous...
Missed that. This will create a lot of directories, but yes, it is easier to do it like that than to track a directory segment index.
 
Amm0
Forum Guru
Posts: 3557
Joined: Sun May 01, 2016 7:12 pm
Location: California

Re: [/file find name=$fileName] using 100% CPU usage

Tue Oct 31, 2023 4:09 am

There was a reference to a bug fix in 7.12rc4 that was vague enough that it might be related to the "slow" find – but that's just a guess:
system - fixed process multithreading (introduced in v7.9);
viewtopic.php?t=200328#p1032962
