
I have the following script:

import bpy
import os

print("Starter")

selection = bpy.context.selected_objects

for obj in selection:
    print("Obj selected")
    me = obj.data
    for edge in me.edges:
        vert1 = me.vertices[edge.vertices[0]]
        vert2 = me.vertices[edge.vertices[1]]
        print("<boundingLine p1=\"{0}f,0.0f,{1}f,1.0f\" p2=\"{2}f,0.0f,{3}f,1.0f\" />".format(vert1.co.x, vert1.co.y, vert2.co.x, vert2.co.y))

Pretty basic, right? It just prints out all the edges to the console, for me to copy-paste into an XML document. When I scale an object and run this script on it, I get the OLD, unscaled values for the object output to the console, from before it was scaled. I have tried moving every vertex in the object along all axes, which results in the output values being the unscaled ones, transformed according to my movement.

If I press N to check the vertices' global values, they are properly scaled.

Why am I not getting the correct values?!?

This script was supposed to save time, but getting anything to work in Blender is a CHORE! It does not help that they have just updated their API, so all the example code out there is outdated!

Apeforce
  • It appears that I am getting the "local" values for the vertices; why would I want those? Why is there no part of the API that states whether the values you are accessing are global or local? – Apeforce Sep 30 '12 at 21:27

3 Answers


Alright, this is the deal: when you scale, translate or rotate an object in Blender, or otherwise perform a transformation, that transformation is "stored" on the object itself rather than written into the mesh data. What you need to do is select the object to which you applied the transformation, use the shortcut CTRL + A, and then apply your transformation.

...

So there was no real discrepancy between the internal data accessible through the Blender API and the values actually displayed; the mesh data simply still had an unapplied object transform sitting on top of it.

I am sure this design makes sense, but right now I want to punch the guy who came up with it in the throat. If I scale something, I intend the thing that got scaled to be scaled!

But anyway, the reason I got the wrong values was that the scaling had not been applied, which you do with CTRL + A in Object Mode, after selecting the object that you scaled.
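For reference, the same apply step can be scripted instead of done with the keyboard. This is a minimal sketch against the 2.6x-era operator API (treat the keyword names as an assumption for other versions); it acts on whatever objects are selected in Object Mode:

import bpy

# Scripted equivalent of CTRL + A > Scale: bake the pending scale of every
# selected object into its mesh data, leaving the object scale at 1,1,1.
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

After this call, the me.vertices[...].co values read by the original script match the scaled values shown in the N panel.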

Apeforce

I'm not really a Blender user (but a Maya one), but I think you could try something different (I would say slower, too): just iterate over the selected vertices, create a locator or a null object, constrain it to the vertex position, and read its x, y, z coordinates. I've done it in Maya and it works.

Let's say something like this:

data_list = []

selection = #selection code here#

for v in selection:
    loc = locator()
    pointconstraint(v, loc)
    data_list.append(loc.translation_attributes)
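In Blender you can skip the helper object entirely, because each object carries a world matrix that already contains its location, rotation and scale. Here is a rough sketch of the same idea, assuming mesh objects are selected and the 2.6x-era API where matrices and vectors are multiplied with * (newer releases use @ instead):

import bpy

data_list = []

for obj in bpy.context.selected_objects:
    for v in obj.data.vertices:
        # matrix_world maps local vertex coordinates into global space
        world_co = obj.matrix_world * v.co
        data_list.append((world_co.x, world_co.y, world_co.z))

print(data_list)

This reads the global positions directly, without modifying the object or its mesh data.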
Netwave
  • Thanks for the reply. I did, however, find the answer shortly after posting the question, but I could not post it because of the 8 hour limit. I am posting the answer right now. – Apeforce Oct 01 '12 at 07:45

Mesh objects have an internal coordinate system for their vertices, as well as global translation, scaling, and rotation transforms that apply to the entire object. You can apply the global scaling matrix to the mesh data, and convert the vertex coordinates to the global coordinate system as follows:

bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.transform_apply(scale=True)
bpy.ops.object.select_all(action='DESELECT')

Other options to transform_apply() allow rotation and translation matrices to be applied as well.
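For example, a sketch that bakes everything at once (keyword names as in the 2.6x operator):

# apply location, rotation and scale of all selected objects in one call
bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)

Note that select_all(action='SELECT') selects every object in the scene; if you only want to touch the objects you already had selected, you can drop the two select_all() calls and run transform_apply() on the current selection.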

Brent Baccala