cs.write_file() strange outcomes

let output = cs.write_file(cs.special_record_id(cs.ref('sp_partner_export_default')), cs.ref('fld_partner_export_file'), "partners.json", outputString, "raw");

Using the above method I am creating an output file as a record. All good so far. I'm outputting (validated) JSON content, but it's just text for the purpose of the file content.

With a small amount of content I don't seem to have an issue, but if I go over a certain number of records I get an empty file being created.

Despite getting a true response from the write, I can only assume the write is actually erroring internally somewhere - maybe there is something in the content it doesn't like, although I struggle to see what given it's just text.

Now, in the Detective, I output the content value directly before the write method and it has lots of JSON (text) in it.

So why am I getting an empty file?

Given I have created multi-MB files using this approach in the past - something is up :stuck_out_tongue_winking_eye:

And no - I'm not yet approaching the file size limit of the platform, nor is the disk full or anything like that.

At the moment I'm only working with small files (< 10 MB while testing), but I'm wondering if, instead of a single write, I should maybe append() per record … would that make a difference, accepting the performance hit?
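Something like this per record is what I have in mind (the append call below is just a guess at the shape of the API to illustrate the idea - I haven't checked the actual method name or signature):

// Hypothetical per-record append loop - the append method name and signature
// here are assumed for illustration, not taken from the docs.
records.forEach(function (rec) {
  let line = JSON.stringify(rec) + '\n';
  cs.append_file(
    cs.special_record_id(cs.ref('sp_partner_export_default')),
    cs.ref('fld_partner_export_file'),
    'partners.json',
    line
  );
});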

May not be the issue at all here, but the options param to write_file should be an object, in this case:
{raw: true}
I note the help for this param is not clear, so we'll get that updated.
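So, taking the call from the top of the thread, it would become something like this (same refs and content, just with the final string argument swapped for an options object):

let output = cs.write_file(
  cs.special_record_id(cs.ref('sp_partner_export_default')), // target record
  cs.ref('fld_partner_export_file'),                         // file field
  'partners.json',                                           // file name
  outputString,                                               // content (a plain text JSON string)
  {raw: true}                                                 // options as an object, not the string 'raw'
);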

I'll take that on board and check whether it makes any difference.

What I will say is that appending (per record) does SEEM to work, but the performance hit of doing it that way is not really acceptable :stuck_out_tongue_winking_eye:

Also worth noting that you shouldn't need the raw option at all (I'm not clear on a use case for that tbh); plain text content doesn't need any extra options.

How large is the content you are writing? And you said it seems to be an issue when you go over a certain number of records rather than content size - can you elaborate on the records bit here?

I'm not sure there is a hard limit on content size; it is probably more to do with the memory limit of the JS engine, so it will depend on what else your code is doing.

Yes, the options param doesn't seem to make much difference either way.

Got to be honest, it's really strange.

I'm not even sure how to explain what I'm seeing at the moment :stuck_out_tongue_winking_eye:

It's not size related, best I can tell - it seems like, under certain circumstances, it's something in the constructed text causing the issue.

If I simplify my output down to just the direct properties and attributes of the record, there's no issue outputting the file at all - even with hundreds of records (a several-MB file created).

The problems seem to begin when I start using relationships or other references to get additional data, which might suggest an application / code problem, BUT ultimately all I'm outputting into the file is text. And the CS component runs from start to finish with no discernible issues or errors.

It's definitely something to do with the text content. If I construct a JSON structure that has the fields but no content values, then all is good no matter how many records I'm outputting.

I'm suspecting hidden or invisible characters may be blowing up the write functions?

So: with all the keys, but empty values.

Yes - so if I escape() the content values then all is good. Obviously this is not an ideal solution, but at least I know what's going on now!!!
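Roughly what I'm doing now, as a minimal sketch (the records variable and field names are just illustrative, not my real ones):

// Run every value through escape() before it goes into the JSON structure,
// so any non-ASCII / hidden characters end up percent-encoded rather than written raw.
// 'records', 'name' and 'notes' are illustrative names only.
let rows = records.map(function (rec) {
  return {
    name: escape(rec.name || ''),
    notes: escape(rec.notes || '')
  };
});
let outputString = JSON.stringify(rows, null, 2);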

So the question is: why does it cause the write() and append() functions to barf? :thinking:

It’s a good question. Did you narrow down any particular characters or values that are problematic? Or have some example code that can reproduce it reliably?

Not specifically, but:

function customReplace(str) {
  // Strip anything outside the printable ASCII range (0x20-0x7E)
  str = str.replace(/[^\x20-\x7E]/g, "");
  return str;
}

fixes it

This at least works - I've tested it with a couple of thousand records creating a file of roughly 5 MB, and it all checks out now with no issues.
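For completeness, this is roughly how I'm applying it - each value goes through customReplace() before a single write (field names here are illustrative, and per the earlier reply the options object may not even be needed for plain text):

// Sanitise each value, build the output string once, then do a single write
// rather than appending per record. 'records', 'name' and 'notes' are illustrative.
let rows = records.map(function (rec) {
  return {
    name: customReplace(rec.name || ''),
    notes: customReplace(rec.notes || '')
  };
});
let outputString = JSON.stringify(rows, null, 2);
let output = cs.write_file(
  cs.special_record_id(cs.ref('sp_partner_export_default')),
  cs.ref('fld_partner_export_file'),
  'partners.json',
  outputString,
  {raw: true}
);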