
Why is the "128 array element limit" even a thing in the 21st century?



Thing is, if you know what you're doing, you can circumvent these limits.

Works for me.
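(For the curious: one common trick is that Papyrus only caps a single array at 128 elements, not how many arrays a script can hold, so you can spread the data over several fixed-size arrays and route each logical index to the right one. A minimal sketch, with made-up script and function names:)

    ScriptName ChunkedIntStore extends Quest
    {Illustrative only: stores up to 256 ints by spreading them over two 128-element arrays.}

    Int[] chunk0
    Int[] chunk1

    Event OnInit()
        chunk0 = new Int[128]
        chunk1 = new Int[128]
    EndEvent

    Int Function GetAt(Int index)
        ; Route the logical index to the array that actually holds it
        If index < 128
            Return chunk0[index]
        Else
            Return chunk1[index - 128]
        EndIf
    EndFunction

    Function SetAt(Int index, Int value)
        If index < 128
            chunk0[index] = value
        Else
            chunk1[index - 128] = value
        EndIf
    EndFunction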

 

For the sake of the API as such, I'd always implement getters and setters for properties that are not logically read-only. Beth has lots of stuff in Papyrus that can be set or read, but not both. That makes it hard to test...
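(A made-up example of what that looks like in Papyrus: a "full" property with an explicit Get and Set, so callers get both directions:)

    ScriptName ExampleApi extends Quest

    Int _charge = 0  ; backing field

    Int Property Charge
        ; Read side
        Int Function Get()
            Return _charge
        EndFunction
        ; Write side
        Function Set(Int value)
            _charge = value
        EndFunction
    EndProperty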


I'd always implement getters/setters for properties that are not logically read-only.

I have no idea what that means, but I am sure it makes sense ...

(How can something that "sets" the value of a property or a variable or whatever be "read only"? That makes no sense ...)


Example: IsProtected flag on Actors (not ActorBase).

You can only set that from Papyrus, but not read it. Yet, it makes no logical sense whatsoever that this particular property of an Actor instance should be write-only.

 

If I were to dispatch a request to have the setter for this implemented, I'd have the coder implement the corresponding getter as well, in this case.

Just so that we have a nice API, you know.

And if someone wanted to write a unit test for their function that sets this flag on certain Actors, they could also read the value back without having to pull off some ass-backwards stunts. Because they probably won't (takes too long, is way too expensive), which means the stuff will simply remain untested.
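(To make that concrete, here is the shape of such a read-back check, shown with an Actor value pair that does exist in both directions, GetActorValue/SetActorValue; with a write-only flag there is simply nothing to read back. Illustrative sketch only:)

    ScriptName ReadBackExample extends Quest

    Function TestRoundTrip(Actor akTarget)
        ; Remember the original value so the test leaves no trace
        Float before = akTarget.GetActorValue("Confidence")

        akTarget.SetActorValue("Confidence", 4.0)
        If akTarget.GetActorValue("Confidence") == 4.0
            Debug.Trace("Confidence round trip: OK")
        Else
            Debug.Trace("Confidence round trip: FAILED")
        EndIf

        akTarget.SetActorValue("Confidence", before)
    EndFunction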



Ah, ok, I get what you are saying (see what I did there?)!

 

So you meant things that only have "set" functions and no "get" functions ...


  • 2 years later...

 

And yes, I know that the "core" of the engine has not been touched in like a decade or two :laugh:.

 

I don't think that is correct. Take a look at the changes to the scripting engine made for Fallout 4. And follow the links %)

 

 

About the reasons for the 128-element limit.

We can only speculate here, as there is no official statement (at least I didn't find anything from the dev team).

 

Why 128, but not 256?

One potential reason was mentioned by aurreth: error checking. Negative numbers are invalid indices (see the difference between signed and unsigned integer types), and a signed 8-bit value only covers 0 to 127 on the non-negative side, i.e. 128 slots instead of 256. It looks like a kind of over-insurance: it is hard to predict potential problems, so it is easier to just play it safe.

 

Why is the limit so low?

One of the reasonable assumptions was mentioned by SSK50: preventing potential money loss. It would be necessary to invest more time (read: "money") to test the impact of a higher array size limit.

128 elements can be viewed and tested manually; 32K/64K (16-bit) elements are extremely hard to review manually.

Development of automated tests is considered "too expensive" by unqualified managers. There is far more unqualified personnel than qualified, so it is easy to imagine that Bethesda uses less qualified managers than you would expect %)

Developers are people, managers are people, business owners want to reduce costs... and they are people too %) They try to do their best, but software development is an area with a huge number of unknowns. To reduce the risk of failure (read: "money loss"), people try to make "safe" decisions at their level of competence and confidence.

 

Personally, I do not see any technical problems with 16 bits for the array size, but 32 bits looks dangerous.

Imagine a script developer makes a mistake and writes code that keeps adding elements to an array forever. 32 bits means 2 billion (signed) or 4 billion (unsigned) elements, and each element is at least 1 byte (more likely 4 bytes or more), so 2^31 elements at 4 bytes each is already on the order of 8 GB. With such a mistake the game would crash or freeze from running out of free RAM; a single mistake could have a huge impact on the whole game. If the limit is low, the wasted memory is insignificant and doesn't affect the rest of the game: the mistake still causes a problem, but a local one, not a global one.
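(A rough sketch of that failure mode, assuming for the sake of argument that nothing caps the growth:)

    ScriptName RunawaySketch extends Quest

    Function LeakForever()
        ; Fallout 4's Papyrus added Add() for arrays; the unbounded growth
        ; is the hypothetical part of this sketch.
        Int[] data = new Int[1]
        While true
            data.Add(0)  ; every pass allocates a little more memory, forever
        EndWhile
    EndFunction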

 

I understand that this might not be the answer you were expecting, but I would prefer to stop speculating %)


The writer of the Papyrus language explicitly stated on the old Bethesda forum that the reason for the small limit is so that mod authors would not create huge loops through the arrays in their mods that could affect performance. :)
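(For illustration only, not part of the quoted statement: a "loop through an array" in Papyrus is an explicit While over the indices, run by the time-sliced script VM, so the cost grows with the array length. For example:)

    ScriptName LoopCostExample extends Quest

    Int Function CountNonZero(Int[] data)
        ; Each iteration is several interpreted VM instructions, so the cost
        ; of this loop scales with data.Length.
        Int found = 0
        Int i = 0
        While i < data.Length
            If data[i] != 0
                found += 1
            EndIf
            i += 1
        EndWhile
        Return found
    EndFunction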



 

Do you know which part of the system's performance was meant? Do you have a full quote?


Entirely self-imposed, as mentioned. It doesn't seem like a wise decision, but we don't have all the context. One thing about arrays vis-à-vis Papyrus is that arrays are reference objects: there's no Papyrus-end destructor, and the C++ side isn't governed by a garbage collector either, so the freeing of memory has to happen explicitly in code.
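(Illustrative sketch: from the script side, the closest thing to freeing an array is dropping every reference to it so the engine can reclaim the storage:)

    ScriptName CacheHolder extends Quest

    Int[] cache

    Function BuildCache()
        cache = new Int[128]
    EndFunction

    Function ReleaseCache()
        ; Arrays are reference objects; once no variable points at this one,
        ; the engine is free to release its memory.
        cache = None
    EndFunction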
