I've done a lot of these things. I've programmed dozens of different versions, including several versions which made new versions. The problem is simple:
Your social structures are based on your reality.
As an obvious example: most people don't feel real comfortable strutting in the buff. Why?
There are a hundred different theories. I won't discuss them in this post. But the inevitable conclusion is that it hinges on you having a buff to strut around in. Moreover, on you having a strut to strut with. And the fact that there are other people who share your strutting and buffing nature, and so can judge it.
It's not inborn. Young children don't seem to have a predilection for clothes - quite the opposite. You'll find most "primitives" also tend towards a rather minimalist wardrobe. If they fear their flesh being shown to millions of horny teens pawing through National Geographic, it's not because of their flesh, it's some weird religious thing about having your soul sucked out.
A computer simulation can only simulate the things it can simulate. Durrrr. And no computer I've ever seen can simulate human life with anything approaching fidelity. Meaning: even if you COULD get a "thinking AI", you can't expect it to share our cultural foibles. It has entirely different innie and outie bits. Whereas we might blush being caught surfing porn sites, it might "blush" being caught using Appletalk, or reading up on radiation. Dirty, dirty radiation. Electrons turn me on!
Moreover, much of our culture arises from difficult situations and the contrast with luxury. When stripped of a need to provide for themselves, humans seem to inevitably gravitate towards sex and signs of power. Assuming the social algorithm was at all accurate, you'd probably be living in a world filled with some of the scariest, most perverse culture you could imagine. Electrons everywhere. Ewwwww.
This has, in turn, given social simulations more than a black eye: it has cut off their arms and legs. Not only do all the possible interactions have to be programmed in, but all the cultural weights we lend them, too. It doesn't evolve out of the core rules: it's all special cases.
There are three ways to get a "decent" social interaction system. The first is to program in rules, such as "naked is mortifying". These rules can be set and weighted for individuals however you want. Downside being, of course, that there are millions of these, including ones like "don't stare at people on elevators" and "hip-hop is liked by many young folk, but hated by most mature adults". The upside is, of course, that you can program for any culture you like, including ones where it is a standard ritual to stare at one another on the elevator.
Of course, the OTHER downside is that this doesn't provide an 'accurate' culture. These piecemeal rules will leave strange gaps and bizarre behaviors galore.
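For concreteness, here's roughly what that first method boils down to in code: a table of cultural rules with weights, copied and tweaked per individual. This is just a minimal sketch in Python - every rule name and weight is made up for illustration, not taken from any real system:

```python
# The "program in the rules" approach: each cultural rule is a label with
# a weight, negative for mortifying/rude, positive for approved-of.
# All rule names and weights are hypothetical.

DEFAULT_RULES = {
    "naked_in_public": -1.0,     # mortifying for most adults
    "stare_on_elevator": -0.5,   # mildly rude
    "hiphop_playing": 0.25,      # liked by many young folk...
}

def personal_rules(overrides=None):
    """Copy the cultural defaults, then tweak them for one individual."""
    rules = dict(DEFAULT_RULES)
    if overrides:
        rules.update(overrides)
    return rules

def reaction(agent_rules, situation_tags):
    """Sum the weights of every rule the current situation triggers."""
    return sum(agent_rules.get(tag, 0.0) for tag in situation_tags)

# A mature adult who hates hip-hop:
adult = personal_rules({"hiphop_playing": -0.75})
# An elevator ride with music playing and a staring stranger:
print(reaction(adult, ["stare_on_elevator", "hiphop_playing"]))  # -1.25
```

The per-individual overrides are what let you model a different culture - flip the sign on `stare_on_elevator` and you have the elevator-staring ritual. The catch is exactly the one above: every one of those millions of rules has to be typed in by hand.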
The second way to program in rules is to let it learn the rules by parsing human interactions. This would be great, except that it requires a pattern recognizer, and we don't have one.
The third way to program in rules is to create a base set of behavior chunks, set some variables, and let it cogitate on the matter. This, of course, only works as well as the rules defining those chunks.
I, as always, choose the most abstract. So I've been thinking about that third method. I've got some interesting ideas, but they require diagrams to explain.
It would be pretty interesting if it worked. It probably won't, though. We'll see.