Sampling the colour values of a pixel on a 2D texture is easy enough in Maya; there is the MEL command 'colorAtPoint', and you can emit particles from a texture and query the RGB values per particle. Sampling a colour value of a 3D texture is not so straightforward, and the approach suggested in the documentation is to bake your 3D texture to an image file and use the aforementioned methods to query colour values on the resulting texture.
In this post, I'll demonstrate how to implement an API function that samples a 3D texture at any given XYZ point, and returns an RGB value.
Put simply, I want to know the RGB value of a 3D texture at a given XYZ point.
The Maya API class MRenderUtil has a function called sampleShadingNetwork that "allows you to sample a shading node/shading engine. You can specify the location and property of the sample point, and the method will return the result color and transparency".
The description of the function continues: "If you specify a shading node to be evaluated, you'll need to provide the attribute to be evaluated. For example, valid shading nodes are checker1.outAlpha, file1.outcolors, etc.."
marble1.outColor? Why not?
1) Sampling a single point:
Let's begin by creating a sphere, a locator, and a marble 3D texture node. Right now in your viewport, you should see a grey sphere, a locator, and a 3D texture placement node.
The following script will read the locator's position, sample the node "marble1.outColor" at that location, and set the sphere's colour to the returned values (you can download this script later; I wouldn't recommend copy/pasting from here...).
Open the Script Editor and begin by importing the necessary Python modules:
import maya.OpenMaya as om
import maya.OpenMayaRender as omr
import maya.cmds as cmds
Now, let's begin the script with a couple of variables to store our point positions and the corresponding RGB values.
pointArray = []  # array of XYZ values
sampledColors = []  # array of sampled RGB values
# get the location of our locator and append to pointArray
locatorPos = cmds.xform("locator1", query=True, ws=True, rp=True)
pointArray.append(locatorPos)
# sample the points stored in pointArray - definition of sampleColorAtPoint to follow
sampledColors = sampleColorAtPoint("marble1.outColor", pointArray)
# change the color of the sphere...
cmds.setAttr("lambert1.color", sampledColors[0][0], sampledColors[0][1], sampledColors[0][2], type="double3")
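To make the data flow explicit, here is a plain-Python illustration of the shapes involved (no Maya required; the colour values are made up):

```python
# sampleColorAtPoint() takes a list of XYZ triples and returns one
# [r, g, b] triple per input point, in the same order.
pointArray = [[0.0, 1.0, 0.0]]          # one sample location
sampledColors = [[0.32, 0.61, 0.58]]    # pretend marble colour at that point

# the three floats handed to setAttr are simply the first result's components
r, g, b = sampledColors[0]
```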
All that is left is to define our sampling procedure. The list of arguments required by MRenderUtil.sampleShadingNetwork() looks daunting, but it isn't: you won't need to provide all of them to sample a 3D texture. As per the docs: "In general [...] 3D textures require points and refPoints"....
def sampleColorAtPoint(shadingNode, points):
    # these are the arguments required to sample a 3D texture:
    shadingNodeAttr = shadingNode
    numSamples = len(points)
    pointArray = om.MFloatPointArray()
    refPointArray = om.MFloatPointArray()
    for i in range(len(points)):
        point = points[i]
        location = om.MFloatPoint(point[0], point[1], point[2])
        pointArray.append(location)
        refPointArray.append(location)
    # we don't need to set these arguments
    useShadowMaps = False
    reuseMaps = False
    uCoords = None
    vCoords = None
    normals = None
    tangentUs = None
    tangentVs = None
    filterSizes = None
    cameraMatrix = om.MFloatMatrix()
    # create the return arguments
    resultColors = om.MFloatVectorArray()
    resultTransparencies = om.MFloatVectorArray()
    # and this is the call to sample the points
    omr.MRenderUtil.sampleShadingNetwork(shadingNodeAttr, numSamples, useShadowMaps, reuseMaps, cameraMatrix, pointArray, uCoords, vCoords, normals, refPointArray, tangentUs, tangentVs, filterSizes, resultColors, resultTransparencies)
    # take a breath..
    # and return the sampled colors as a list
    sampledColorsArray = []
    for i in range(resultColors.length()):
        resultVector = om.MFloatVector(resultColors[i])
        sampledColorsArray.append([resultVector.x, resultVector.y, resultVector.z])
    return sampledColorsArray
# end of sampleColorAtPoint
That wasn't so bad, was it?
You'll notice that my procedure takes in an array of points so you can pass in as many points as you like and they will be sampled at the same time. In fact, this is quicker than calling OpenMayaRender.MRenderUtil.sampleShadingNetwork() for each point.
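For example, a grid of sample points can be built in plain Python and handed to the procedure in one batched call (the grid helper below is my own; only the final, commented-out line needs a running Maya session with the marble1 node from above):

```python
def makeGridPoints(rows, cols, spacing=1.0):
    """Return a flat list of XYZ points laid out on a grid in the XZ plane."""
    points = []
    for i in range(rows):
        for j in range(cols):
            points.append([i * spacing, 0.0, j * spacing])
    return points

gridPoints = makeGridPoints(10, 10)   # 100 points

# Inside Maya, all 100 points are then sampled with a single call:
# sampledColors = sampleColorAtPoint("marble1.outColor", gridPoints)
```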
2) On to particles..
How about applying this script to a particle object? It's really not that difficult once you know how to stuff the returned RGB values back into the particle's rgbPP array using MFnParticleSystem.
Here are the steps:
1) Create a 3D texture node, eg: marble
2) Create a grid of particles. I find this is the best way to preview the script
3) Add an rgbPP attribute to the particle object
4) Load SampleShadingNetworkParticles into the Script Editor and run it – it imports the necessary Python modules, defines the sampling procedure, then calls the procedure to apply "marble1.outColor" to "particleShape1".
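As a sketch of the write-back step, the conversion from sampled colours to per-particle data looks like this. The function below is plain Python; the MFnParticleSystem calls in the comments are the part that must run inside Maya, and the variable names there are assumptions, not part of the original script:

```python
def colorsToRgbPP(sampledColors):
    """Return one (r, g, b) tuple per particle, in particle order."""
    return [tuple(rgb) for rgb in sampledColors]

rgbPP = colorsToRgbPP([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])

# Inside Maya (sketch, untested here), these tuples are packed into an
# MVectorArray and pushed onto the particle shape:
#   import maya.OpenMayaFX as omfx
#   vectors = om.MVectorArray()
#   for r, g, b in rgbPP:
#       vectors.append(om.MVector(r, g, b))
#   fnParticles = omfx.MFnParticleSystem(particleShapeDagPath)
#   fnParticles.setPerParticleAttribute("rgbPP", vectors)
```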
If you want to take this further and have the particle colour update per frame, then I suggest you call the procedure once in an after-dynamics runtime expression, for the last particle only – that way you only sample the colors once per frame.
Have fun!