Unfortunately, this does not prevent "DAG zero..." from appearing,
period. Rather, it just overwrites any junk printed to the console
during the export. The ANSI version is rather limited compared to the
Windows version and completely untested...
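For reference, the ANSI path boils down to cursor-up and erase-line escapes. A minimal sketch (hypothetical helper name, assuming a VT100-compatible terminal):

```python
import sys

def _clear_console_lines(count):
    # Move the cursor up one line and erase it, once per line of junk
    # printed during the export. ESC[1A = cursor up, ESC[2K = erase line.
    for _ in range(count):
        sys.stdout.write("\x1b[1A\x1b[2K")
    sys.stdout.flush()
```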
Due to memory management issues, it is likely impossible to remove a
node in its own init() callback without crashing Blender. Instead, we
now deactivate any output node operators if an output node is already
present in the tree.
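A minimal sketch of the deactivation idea; the operator and node idnames are hypothetical. Returning False from poll() is what grays the operator out in the UI:

```python
import bpy

class PlasmaAddOutputNode(bpy.types.Operator):
    bl_idname = "node.plasma_add_output_node"  # hypothetical idname
    bl_label = "Add Output Node"

    @classmethod
    def poll(cls, context):
        # Much safer than letting a duplicate node remove itself from
        # its own init() callback.
        tree = getattr(context.space_data, "node_tree", None)
        return tree is not None and not any(
            node.bl_idname == "PlasmaOutputNode"  # hypothetical node type
            for node in tree.nodes)

    def execute(self, context):
        context.space_data.node_tree.nodes.new("PlasmaOutputNode")
        return {"FINISHED"}
```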
Recall that PFMs are added as modules to the global modules dict.
Therefore, module names must be valid Python 2.x identifiers. This is
handled well for age names, but we've been neglecting to handle it for
PFM names. So, when Blender adds ".001" as a suffix to a duplicated
object, any attached PFMs will go down in a fiery dust explosion,
generally with the unhelpful error message: "NULL result without error
in PyObject_Call".
This is like the Package Python operator in that the artist can now
export the loc data independently of the rest of the age. The difference
is that due to the extremely different loc formats from PotS+UAM to
MOUL, we can't easily allow the user to specify where the data will be
exported to. So, we only allow this operation in the context of a game.
Two problems were fixed here:
- Materials were always exported unconditionally, meaning that we were
wasting time processing the same data over and over. This could have
generated "interesting" data (e.g. multiple hsGMaterials and friends
with the same name) that Plasma would have barfed on. See the sketch
after this list for the memoization idea.
- Lightmaps were being applied to the incorrect materials.
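A sketch of the memoization fix, with hypothetical names. The important bit is that the cache key includes the lightmap, so lightmapped variants get their own hsGMaterial instead of stomping on someone else's:

```python
class MaterialConverter:
    def __init__(self):
        self._exported = {}

    def export_material(self, bl_material, lightmap=None):
        # Reuse the previously converted hsGMaterial, if any.
        key = (bl_material.name,
               lightmap.name if lightmap is not None else None)
        hsgmat = self._exported.get(key)
        if hsgmat is None:
            hsgmat = self._do_export(bl_material, lightmap)
            self._exported[key] = hsgmat
        return hsgmat

    def _do_export(self, bl_material, lightmap):
        raise NotImplementedError("actual conversion elided")
```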
See the comment for details. I've been seeing this crash since we
started doing fancy idprop stuff. Of course, my test blend has always
had bleeding edge junk and has crashed left, right, and center. For more
fun, follow the progress on D4196.
If a file's data is already available in Blender, it may have changed
there: for example, an internal text datablock or a text file edited
after it was loaded. We need to use those overrides.
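A sketch of that logic with a hypothetical helper; `is_in_memory` and `is_dirty` are the relevant `bpy.types.Text` flags:

```python
import bpy

def load_text_source(text_id):
    text = bpy.data.texts[text_id]
    if text.is_in_memory or text.is_dirty:
        # Internal datablock, or edited in Blender and not yet saved:
        # the datablock is the authoritative copy.
        return text.as_string()
    # Otherwise, the file on disk is current.
    with open(bpy.path.abspath(text.filepath), "r") as fp:
        return fp.read()
```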
Path of the Shell did not like my fancy metaprogramming tricks for
defining an AgeSDL Python class that contained characters that are
illegal in Python identifiers. So, now, we revert to just using a
standard class declaration.
That means that we need to strip out any illegal characters from the
age name first. A legal Python 2.x identifier is constrained to the
ASCII alphanumeric characters and the underscore with the stipulation
that the first character cannot be a number. To illustrate this to the
artist, we alert the age name property field if an illegal character is
found in the age name. We also alert on the underscore, which is now
used as a very very special replacement character. In the case of an
illegal character, an error message is shown in the UI with the correct
AgeSDL name.
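A sketch of the sanitizer (the helper name is hypothetical), implementing exactly the rules above:

```python
import re

def sanitize_age_name(age_name):
    # Replace anything outside [A-Za-z0-9_] with the very, very special
    # underscore, then guard against a leading digit.
    identifier = re.sub(r"[^A-Za-z0-9_]", "_", age_name)
    if identifier and identifier[0].isdigit():
        identifier = "_" + identifier
    return identifier
```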
Of course, I hope no one really uses those illegal characters and this
is just more fulmination on my part...
There are some cases where errors, while bad, are not the end of the
world. I'm thinking mainly of compyling the age Python. The age still
exports just fine, but the ancillary data is flawed. This new system
collects nonfatal errors until the export is done, then raises them all
at once.
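A sketch of the collector, with hypothetical names. Exporters log problems as they occur, and the raise happens exactly once, after the export completes:

```python
class NonfatalExportError(Exception):
    pass

class ExportReport:
    def __init__(self):
        self._errors = []

    def nonfatal(self, msg, *args):
        # Remember the problem but keep exporting.
        self._errors.append(msg.format(*args))

    def raise_errors(self):
        # Called once the export is done.
        if self._errors:
            raise NonfatalExportError("\n".join(self._errors))
```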
Version 2 of the Python File node is now backed by a `bpy.types.Text`
datablock; when the node points at an external file, the datablock's
attributes are updated from that backing file.
Implements the boilerplate code for compiling Python code in arbitrary
Python versions and packing the marshalled data into Cyan's Python.pak
format. Since this is a lot of boilerplate, a separate operator has been
added
to both test the resulting mayhem and provide age creators an easy way
to export only their needed Python.
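The compile half can be sketched by shelling out to the target interpreter and marshalling on its side of the pipe. The names here are hypothetical; this assumes a Python 2 target, and `-u` keeps the pipes binary on Windows:

```python
import subprocess

_COMPYLE_SCRIPT = """\
import marshal, sys
source = sys.stdin.read().replace("\\r\\n", "\\n")
code = compile(source, sys.argv[1], "exec")
sys.stdout.write(marshal.dumps(code))
"""

def compyle(python_exe, file_name, source):
    # Run the compile in the *target* interpreter so the marshalled code
    # object matches that version's bytecode format.
    proc = subprocess.Popen([python_exe, "-u", "-c", _COMPYLE_SCRIPT,
                             file_name],
                            stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    data, _ = proc.communicate(source.encode("utf-8"))
    if proc.returncode != 0:
        raise RuntimeError("failed to compyle '{}'".format(file_name))
    return data
```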
The only Python currently packed is the age SDL hook file, if
any. In order for that part to happen, Python File nodes need to be
upgraded from having a string path to actually using the new text_id
field.
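The packing half, sketched against my reading of the Python.pak layout in libHSPlasma: a uint32 file count, per-file a "safe string" name plus a uint32 absolute offset, then uint32-size-prefixed marshal blobs. Treat the details, especially the safe-string encoding, as assumptions:

```python
import struct

def _write_safe_str(stream, value):
    # Plasma "safe string": uint16 length OR'd with 0xf000, then each
    # character byte bitwise-NOT'd (assumption; cf. hsStream::writeSafeStr).
    data = value.encode("utf-8")
    stream.write(struct.pack("<H", (len(data) & 0x0FFF) | 0xF000))
    stream.write(bytes((~ch) & 0xFF for ch in data))

def write_python_pak(path, modules):
    """modules: sequence of (name, marshalled_bytes) pairs."""
    header_size = 4 + sum(2 + len(name.encode("utf-8")) + 4
                          for name, _ in modules)
    with open(path, "wb") as stream:
        stream.write(struct.pack("<I", len(modules)))
        offset = header_size
        for name, data in modules:
            _write_safe_str(stream, name)
            stream.write(struct.pack("<I", offset))
            offset += 4 + len(data)
        for _, data in modules:
            stream.write(struct.pack("<I", len(data)))
            stream.write(data)
```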
Age output files are now handled in all aspects by a singleton manager.
This allows us to track all generated files and external dependency
files and ensure they are correctly copied over to the target game... Or
not, in the case of an age/prp export from the File > Export menu.
Currently, only SFX files are handled as external dependencies; Python
and SDL files are still TODO.
Further, because we have an output file manager, we can bundle all the
files into a zip archive for releasing the age in one step. Wow such
amazing. ^_^
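A sketch of the manager idea, with hypothetical names; once every output and dependency file is tracked in one place, both the copy-to-game step and the zip bundling fall out naturally:

```python
import shutil, zipfile
from pathlib import Path

class OutputFiles:
    def __init__(self):
        self._files = []  # (source path, destination name)

    def add_file(self, source, dest_name=None):
        source = Path(source)
        self._files.append((source, dest_name or source.name))

    def copy_to_game(self, game_dir):
        # Skipped entirely for a File > Export style dump.
        for source, dest_name in self._files:
            shutil.copy2(str(source), str(Path(game_dir) / dest_name))

    def bundle(self, zip_path):
        # Release the age in one step. Wow such amazing.
        with zipfile.ZipFile(str(zip_path), "w", zipfile.ZIP_DEFLATED) as zf:
            for source, dest_name in self._files:
                zf.write(str(source), dest_name)
```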
Plasma game installs are a per-user config item and should not be stored
in a blend file. Considering that we will be adding more per-user
configs, namely Python 2.[2|3|7] install directories, it seems like a
good idea to go ahead and move the games over.
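What per-user storage might look like as addon preferences; the property group and bl_idname here are assumptions:

```python
import bpy

class PlasmaGame(bpy.types.PropertyGroup):
    name = bpy.props.StringProperty(name="Name")
    path = bpy.props.StringProperty(name="Path", subtype="DIR_PATH")

class KormanPreferences(bpy.types.AddonPreferences):
    # Lives in Blender's per-user config, not the blend file.
    bl_idname = "korman"  # assumed addon module name

    games = bpy.props.CollectionProperty(type=PlasmaGame)

    def draw(self, context):
        for game in self.games:
            self.layout.prop(game, "path", text=game.name)

# NB: register PlasmaGame before KormanPreferences.
```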
A common error when developing with Korman and Korlib is to forget to
recompile _korlib on changes in the upstream C++ code. This will prevent
the errors from being catastrophic and will fall back to the Python
reference implementation with a minimum of fuss.
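A sketch of the fallback; the version probe and the fallback module name are hypothetical:

```python
try:
    from _korlib import *

    if not _correct_api_version():  # hypothetical version probe
        raise ImportError("_korlib is stale; please recompile")
except ImportError:
    # Revert to the pure-python reference implementation; the module
    # name is an assumption about the package layout.
    from ._reference import *
```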
If the texture cache doesn't return images in exactly the order or way
that libHSPlasma is expecting, it raises a RuntimeError. We can detect
that we used a cached image and regenerate the data in that case...
instead of just outright failing with "image data size mismatch".
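A sketch of the retry, with hypothetical helpers; the RuntimeError is only swallowed when the data actually came from the cache:

```python
def export_image(image, cache):
    data = cache.get(image.name)
    from_cache = data is not None
    if not from_cache:
        data = generate_image_data(image)  # hypothetical generator
    try:
        return build_mipmap(image, data)  # hypothetical libHSPlasma wrapper
    except RuntimeError:
        # "image data size mismatch" is recoverable only if we trusted
        # stale cached data. Regenerate and try once more.
        if not from_cache:
            raise
        cache.pop(image.name)
        return build_mipmap(image, generate_image_data(image))
```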
This operator takes a file as an argument and builds a cubemap from it.
One option is to supply the output of Plasma's
Graphics.Renderer.GrabCubeMap console command; the operator will find
the other five files and generate a cubemap with the faces saved by
Plasma. Otherwise, any arbitrary image can be supplied. If the filenames
do not fit the expected format, any missing faces will be replaced by
the face specified in the file selector. This will generally result in a
cubemap with six identical faces.
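A sketch of the filename derivation; the face suffixes here are an assumption, so verify them against what GrabCubeMap actually writes:

```python
import os, re

_FACE_TOKENS = ("LF", "RT", "FR", "BK", "UP", "DN")  # assumed suffixes

def find_cube_files(filepath):
    dirname, filename = os.path.split(filepath)
    match = re.match(r"(.+)_({})(\.\w+)$".format("|".join(_FACE_TOKENS)),
                     filename)
    if match is None:
        # Arbitrary image: all six faces will be identical.
        return {token: filepath for token in _FACE_TOKENS}
    stem, _, ext = match.groups()
    faces = {}
    for token in _FACE_TOKENS:
        candidate = os.path.join(dirname, "{}_{}{}".format(stem, token, ext))
        # Missing faces fall back to the file picked in the selector.
        faces[token] = candidate if os.path.exists(candidate) else filepath
    return faces
```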
Previously, we allowed OpenGL to generate all of the mip levels for us
in a mipmap. This was pretty doggone fast and worked reasonably well.
However, with cube maps, we will need to use images that are not always
backed in Blender... this is because Blender stores cube maps as one
single image instead of one image per face. So, we need to be able to
generate those mip levels, preferably without touching Blender's
`Image.pixels`, which is slower than Christmas...
Also of note... `Image.gl_load()` will actually scale the image to a POT
when Blender is using OpenGL ES... but not on other platforms. So, now,
we just ask Blender to load the image and deal with the POT-izing later.
The con here is that the pure python implementation of the image scaling
function is SLOOOOOOOW. We're talking ~40 seconds to process a 1024x1024
mipmap. No one should be using the reference implementation, however,
and the C++ implementation shows no noticeable slowdown over the OpenGL
code.
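For the curious, the reference implementation is essentially a 2x2 box filter applied level by level over raw RGBA bytes, which is exactly the kind of nested loop pure Python hates (a sketch, not the actual Korman code):

```python
def generate_mip_levels(pixels, width, height):
    """pixels: bytes-like RGBA8 data for the top level."""
    levels = [(pixels, width, height)]
    while width > 1 or height > 1:
        new_w, new_h = max(width // 2, 1), max(height // 2, 1)
        smaller = bytearray(new_w * new_h * 4)
        for y in range(new_h):
            for x in range(new_w):
                for c in range(4):
                    # Average the 2x2 block from the previous level,
                    # clamping at the edges for odd dimensions.
                    total = 0
                    for dy in (0, 1):
                        for dx in (0, 1):
                            sx = min(x * 2 + dx, width - 1)
                            sy = min(y * 2 + dy, height - 1)
                            total += pixels[(sy * width + sx) * 4 + c]
                    smaller[(y * new_w + x) * 4 + c] = total // 4
        pixels, width, height = smaller, new_w, new_h
        levels.append((pixels, width, height))
    return levels
```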
Whew.