[NetBehaviour] Tate Make Their Collection Metadata Free-as-in-Freedom

Rob Myers rob at robmyers.org
Mon Nov 11 19:32:13 CET 2013

Was it conceptual art?

James Morris <jwm.art.net at gmail.com> wrote:
>I got a blank one last night: no artwork, no categories, just the
>question asking me if I see _ and _.
>On Nov 10, 2013 12:29 PM, "shardcore" <shardcore at shardcore.org> wrote:
>> Thanks for the kind words, everyone.
>> autoserota is really just the baseline model of what I'd like to do with
>this dataset.
>> As I mentioned in the blogpost, the most interesting categories, for me,
>are the more subjective ones, the categories which feel like they're
>furthest along the 'I need a human to make this judgement' axis. This
>dataset goes beyond simple 'fact based' descriptions, which means it
>contains a whole lot more humanity than most 'big data'.
>> We can imagine machines which spot the items within a
>work (look at Google Goggles, for example) but algorithms which spot
>'emotions and human qualities' of a work are more difficult to imagine.
>These categories capture complex, uniquely human judgements which
>occupy a
>space which we hold outside of simple visual perception. In fact I think
>I'd find a machine which could accurately classify an artwork in this
>way a
>little sinister...
>> The relationships between these categories and the works are allusive
>in nature: allusions to whole classes of human experience that cannot be
>derived from simply 'looking at' the artwork. The exciting part of the
>data is really the 'humanity' it contains, something absolutely vital
>when we're talking about art - after all, culture cannot exist without
>culturally informed entities experiencing it.
>> This data is represented as JSON; it has been expressed in a
>machine-readable form explicitly for algorithmic manipulation. It gives us
>a fascinating opportunity to investigate how machines can navigate a
>cultural space, precisely because it's been imbued with 'cultural
>knowledge' by the hard-working taggers of The Tate.
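The machine-readable form described above can be walked programmatically. Below is a minimal sketch of pulling the leaf subject tags out of one artwork record; the trimmed JSON and field names (`subjects`, `children`, `name`) follow the general shape of the Tate's open metadata release, but the record itself is illustrative, not a real entry.

```python
import json

# A trimmed, illustrative record in the style of the Tate's open
# metadata release (real records carry many more fields).
record = json.loads("""
{
  "title": "Example Work",
  "subjects": {
    "name": "subject",
    "children": [
      {"name": "emotions, concepts and ideas",
       "children": [{"name": "universal concepts",
                     "children": [{"name": "birth to death"}]}]}
    ]
  }
}
""")

def leaf_tags(node):
    """Recurse through the nested category tree, yielding leaf tag names."""
    children = node.get("children")
    if not children:
        yield node["name"]
    else:
        for child in children:
            yield from leaf_tags(child)

tags = list(leaf_tags(record["subjects"]))
print(tags)  # ['birth to death']
```

The leaves are exactly the subjective, human-applied judgements discussed above; everything nearer the root is broad classification.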
>> On 9 Nov 2013, at 23:30, Rob Myers wrote:
>> > On 09/11/13 03:07 PM, Bjørn Magnhildøen wrote:
>> >> "can you see the $x and the $y?"
>> >
>> > Yes it's very simple but the effect of framing it as a question
>makes it
>> > very effective IMO. :-)
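The "$x and $y" template above is mechanical to fill: draw two tags and frame them as a question. A sketch, with an invented tag list (in practice the tags would come from an artwork's subject metadata):

```python
import random

# Illustrative tags; in practice these would be drawn from the
# subject metadata of a real artwork.
tags = ["sea", "lighthouse", "grief", "man", "tree", "birth to death"]

def see_question(tags, rng=random):
    """Frame two distinct randomly chosen tags as a 'can you see...?' question."""
    x, y = rng.sample(tags, 2)
    return f"can you see the {x} and the {y}?"

print(see_question(tags))
```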
>> >
>> >> i'd like to do something with the categories themselves,
>> >> how the concepts surround and define the works
>> >> thinking of a descriptive art from it
>> >> or instructive art
>> >> then mutated art maybe
>> >> algorithmic selection
>> >> "as stated", dictated, mutated
>> >
>> > Definitely. Once we know how existing objects are described we can
>> > describe objects that don't yet exist. :-)
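One way to read "describe objects that don't yet exist": recombine the tag sets of existing works into a description no single work carries. A toy sketch, with invented tag sets standing in for real category data:

```python
import random

# Invented tag sets standing in for two real works' categories.
work_a = {"sea", "storm", "despair"}
work_b = {"interior", "mirror", "calm"}

def imaginary_work(a, b, rng=random):
    """Describe a nonexistent work by recombining the tags of works that exist."""
    chosen = rng.sample(sorted(a | b), 3)
    return "an artwork depicting " + ", ".join(sorted(chosen))

print(imaginary_work(work_a, work_b))
```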
>> >
>> > _______________________________________________
>> > NetBehaviour mailing list
>> > NetBehaviour at netbehaviour.org
>> > http://www.netbehaviour.org/mailman/listinfo/netbehaviour

Sent from my Android device with K-9 Mail. Please excuse my brevity.
