Half of this question may apply to other language bindings with built-in garbage collectors; the other half is (I think) exclusive to GJS, since it refers to GNOME Shell.
I was recently made aware that the destroy method has different meanings in the GJS/GTK world, and that, while most GLib objects’ resources are cleaned up once all references to them are lost, others, like Gtk.Window, are the developer’s responsibility to destroy.
Another example of this happens with St widgets in the GJS/Shell world (from what I’ve seen in extension code). Is there an exhaustive list, or a rule of thumb, for knowing when it is the developer’s responsibility to destroy an object that is no longer needed, and when the developer shouldn’t do that, besides looking at existing code?
The rule of thumb is generally to just read the documentation. The documentation for Gtk.Window mentions that GTK itself owns the last reference, for example.
In the general case, you should avoid destroy() when possible, since it is a feature-ful complication to reference counting. Usually reference counting is very simple: start with a reference count of 1, and when it reaches 0, free the object. This becomes complicated and burdensome when you have trees of widgets, some of which may depend indirectly on each other (e.g. selectable rows associated with a stack or pages) or require circular references.
In most implementations the destroy signal is emitted just before an object is freed. destroy() functions usually exist to bypass the reference count and jump straight to 0, resulting in a destroy signal emission. When an object emits destroy, you must drop any reference you hold; the object is now invalid, and attempting to use it may result in a segfault (if GJS didn’t catch these things).
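To make the contract concrete, here is a plain-JavaScript sketch of the convention described above. This is not real GObject/GJS code; the `RefCounted` class and its method names are hypothetical stand-ins used only to illustrate how destroy() bypasses the count and why holders must drop their references when the signal fires:

```javascript
// Hypothetical mock of the GObject destroy convention (not a real API).
class RefCounted {
    constructor() {
        this._refCount = 1;          // conventional starting count
        this._destroyHandlers = [];
        this.destroyed = false;
    }
    connectDestroy(handler) { this._destroyHandlers.push(handler); }
    ref()   { this._refCount++; }
    unref() { if (--this._refCount === 0) this._finalize(); }
    // destroy() jumps straight to 0, regardless of outstanding references
    destroy() { this._refCount = 0; this._finalize(); }
    _finalize() {
        if (this.destroyed)          // guard: never emit 'destroy' twice
            return;
        this.destroyed = true;
        // 'destroy' is emitted just before the object would be freed
        this._destroyHandlers.forEach(h => h(this));
    }
}

// A holder must drop its reference when 'destroy' is emitted:
const holder = { child: new RefCounted() };
holder.child.connectDestroy(() => { holder.child = null; });
holder.child.destroy();
console.log(holder.child); // null — the reference was dropped
```

Note the guard in `_finalize()`: it models exactly the “don’t mistakenly call destroy() twice” hazard mentioned below.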
This is the general principle of destroy() functions and destroy signals in GObject libraries, but they are just conventions and the logic sometimes breaks down with objects like GtkWindow (or in your case GdkSurface). Whenever possible allow GJS to handle the final reference drop, and report when the “obvious” way of doing things results in errors.
You can (and sometimes have to) use destroy functionality, but remember this almost always comes with more work, like handling the destroy signal correctly and not mistakenly calling destroy() twice, or on an object after destroy has already been emitted.
Thank you for your reply. I understand now why a top-level Gtk.Window needs to be destroyed; however, why does a Clutter.Actor need to be destroyed instead of just being removed from its parent and losing all other references to it?
No, when the reference count reaches 0, the actor will be freed.
Before it is finally freed, it will always emit a destroy signal. If you want to destroy an actor while something else is holding a reference (e.g. a St.BoxLayout container), you can call child.destroy(), and the St.BoxLayout, which watches that signal, will remove the actor and release its reference.
If you call boxLayout.remove_child(child) and boxLayout holds the last reference, after it is removed the child’s reference count will drop to 0 and destroy will be emitted.
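The remove_child() path can be sketched the same way. Again a plain-JS mock, not the real Clutter/St API (`Actor`, `Box`, and `onDestroy` are made-up names): when the container holds the last reference, removal alone drops the count to 0 and destroy is emitted.

```javascript
// Hypothetical mock (not the real Clutter API) of the remove_child() path.
class Actor {
    constructor() { this.refCount = 1; this.onDestroy = null; }
    unref() {
        if (--this.refCount === 0 && this.onDestroy)
            this.onDestroy();            // emitted just before the free
    }
}

class Box {
    constructor() { this.children = []; }
    addChild(a) { a.refCount++; this.children.push(a); }
    removeChild(a) {
        this.children = this.children.filter(c => c !== a);
        a.unref();                       // may be the final reference
    }
}

const box = new Box();
const child = new Actor();
box.addChild(child);                     // count: 2
child.unref();                           // drop our own reference; box keeps one
let emitted = false;
child.onDestroy = () => { emitted = true; };
box.removeChild(child);                  // last reference gone → destroy emitted
console.log(emitted); // true
```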
Oh, I see, so when extensions call destroy() on the widgets they created upon being disabled, it is just to ensure that any other lingering reference is released (provided everything else holding a reference was appropriately listening for the destroy signal), right?
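That understanding can be sketched as the usual enable()/disable() shape. `PanelButton` below is a hypothetical stand-in for whatever St widget an extension would create (a real extension would build an actual actor); only the destroy bookkeeping is the point:

```javascript
// Stand-in for an St widget; a real widget would emit 'destroy' here and
// every watcher (containers, signal handlers) would drop its reference.
class PanelButton {
    constructor() { this.destroyed = false; }
    destroy() { this.destroyed = true; }
}

class MyExtension {
    enable() {
        this._button = new PanelButton();
    }
    disable() {
        // Forcing destroy() ensures holders watching the 'destroy' signal
        // release their references too, not just this extension's own.
        if (this._button) {
            this._button.destroy();
            this._button = null;     // drop our reference and avoid reuse
        }
    }
}
```

The null-check in disable() mirrors the earlier warning about not calling destroy() on an object that has already been destroyed.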