I'm working through the OpenGL for iOS tutorial by Ray Wenderlich in an attempt to convert his code from Objective-C to Swift.
I am very new to OpenGL and to Swift and believe my problem has to do with how I have translated the Objective-C. Here's why:
In my Swift file that sets up the view containing the OpenGL content, the app crashes with an EXC_BAD_ACCESS error on the final logical step (the call to glDrawElements). If, however, I move this portion of the code to an Objective-C file, the app works as expected.
Swift version of this code:
    var positionDataOffset: Int = 0
    glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), VertexDataSource.sizeOfVertex(), &positionDataOffset)

    var colorDataOffset = (sizeof(Float) * 3) as AnyObject
    glVertexAttribPointer(self.positionSlot, 4 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), VertexDataSource.sizeOfVertex(), VertexDataSource.vertexBufferOffset())

    var vertexOffset: Int = 0
    glDrawElements(GL_TRIANGLES.asUnsigned(), VertexDataSource.vertexCount(), GL_UNSIGNED_BYTE.asUnsigned(), &vertexOffset)
And here is the Objective-C version:
    glVertexAttribPointer(position, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
    glVertexAttribPointer(color, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*) (sizeof(float) * 3));
    glDrawElements(GL_TRIANGLES, sizeof(Indices)/sizeof(Indices[0]), GL_UNSIGNED_BYTE, 0);
As you can see, the Swift is much more verbose... I'm new to this like everyone else. :)
One other note: in the Swift version, you'll see several calls to class methods on the class VertexDataSource. Essentially, I couldn't for the life of me determine how to convert some portions of the Objective-C to Swift, so I decided to (FOR NOW) create a small class in Objective-C that could supply the Swift code with those values. Here are those methods in Objective-C:
    + (GLint)sizeOfVertex { return sizeof(Vertex); }
    + (GLint)sizeOfIndices { return sizeof(Indices); }
    + (GLint)sizeOfIndicesAtPositionZero { return sizeof(Indices[0]); }
    + (GLint)sizeOfVertices { return sizeof(Vertices); }
    + (GLvoid *)vertexBufferOffset { return (GLvoid *)(sizeof(float) * 3); }
    + (GLint)vertexCount { return self.sizeOfIndices / sizeof(GLubyte); }
Any help translating those lines to Swift would be amazing.
EDIT #1
As Reto Koradi pointed out, the Swift code above references self.positionSlot twice rather than using the colorSlot. This was a mistake I made when posting the code here, not an actual mistake in my code.
So the problem still exists.
Updated Swift:
    var positionDataOffset: Int = 0
    glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), VertexDataSource.sizeOfVertex(), &positionDataOffset)

    var colorDataOffset = (sizeof(Float) * 3) as AnyObject
    glVertexAttribPointer(self.colorSlot, 4 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), VertexDataSource.sizeOfVertex(), VertexDataSource.vertexBufferOffset())

    var vertexOffset: Int = 0
    glDrawElements(GL_TRIANGLES.asUnsigned(), VertexDataSource.vertexCount(), GL_UNSIGNED_BYTE.asUnsigned(), &vertexOffset)
EDIT #2: Solved
I ended up solving this. The problem was that my conversion of the Objective-C to Swift was incorrect in several places. In particular, passing &positionDataOffset and &vertexOffset hands OpenGL the address of a Swift variable, but with vertex and index buffer objects bound, those final parameters are interpreted as byte offsets into the bound buffer, so each pointer has to be constructed from the offset value itself. For brevity I'll post the final version of the Swift code for the portion I was originally concerned about, but you can view the full source code of the working result here in this example GitHub repo.
The final Swift code:
    // Offsets are now passed as pointers built from integer values,
    // not as addresses of Swift variables.
    let positionSlotFirstComponent: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))
    glVertexAttribPointer(self.positionSlot, 3 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), Int32(sizeof(Vertex)), positionSlotFirstComponent)

    // The color data starts 3 floats into each Vertex.
    let colorSlotFirstComponent: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(sizeof(Float) * 3))
    glVertexAttribPointer(self.colorSlot, 4 as GLint, GL_FLOAT.asUnsigned(), GLboolean.convertFromIntegerLiteral(UInt8(GL_FALSE)), Int32(sizeof(Vertex)), colorSlotFirstComponent)

    // Index count = sizeof(Indices) / sizeof(Indices[0]).
    let vertexBufferOffset: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))
    glDrawElements(GL_TRIANGLES.asUnsigned(), Int32(GLfloat(sizeofValue(Indices)) / GLfloat(sizeofValue(Indices.0))), GL_UNSIGNED_BYTE.asUnsigned(), vertexBufferOffset)
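For completeness, the VertexDataSource helper methods from the question can be translated along the same lines. This is only a rough sketch, using the same conversions as the final code above; it assumes the Vertex struct and the Indices array are visible to Swift through the bridging header (Indices comes in as a tuple, hence sizeofValue):

    // Sketch: Swift equivalents of the Objective-C helper methods.
    func sizeOfVertex() -> GLint {
        // sizeof(Vertex)
        return GLint(sizeof(Vertex))
    }

    func vertexCount() -> GLint {
        // sizeof(Indices) / sizeof(Indices[0])
        return GLint(sizeofValue(Indices) / sizeofValue(Indices.0))
    }

    func vertexBufferOffset() -> CConstVoidPointer {
        // (GLvoid *)(sizeof(float) * 3): a pointer whose value is the byte offset
        return COpaquePointer(UnsafePointer<Int>(sizeof(Float) * 3))
    }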
I'm going to go ahead and accept Reto Koradi's answer, as it certainly got me on the right track.
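One last note for anyone hitting the same EXC_BAD_ACCESS: the last argument to glVertexAttribPointer and glDrawElements is only treated as a byte offset because the tutorial binds vertex and index buffer objects before drawing. A minimal sketch of that setup (vertexBuffer and indexBuffer are illustrative GLuint names, not identifiers from the tutorial):

    // With buffer objects bound, OpenGL reads vertex and index data from the
    // bound buffers, and the pointer arguments become byte offsets into them.
    glBindBuffer(GL_ARRAY_BUFFER.asUnsigned(), vertexBuffer)
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER.asUnsigned(), indexBuffer)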