sprite-kit

In SpriteKit does touchesBegan run in the same thread as the SKScene update method?

时光怂恿深爱的人放手 submitted on 2019-12-05 04:18:23
The Apple documentation here, Advanced Scene Processing, describes the update method and how a scene is rendered, but it does not mention when input is processed. It is not clear whether this happens on the same thread as the rendering loop, or concurrently with it. If I have an object that I update from both the SKScene update method and the touchesBegan method (in this case of an SKSpriteNode), do I have to worry about synchronising the two accesses to my object? So after a few days with no answer I set up some experiments. By the way, these tests are run on the simulator and not on
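One quick way to run the experiment the question describes is to log the current thread from both callbacks. In my understanding, SpriteKit calls update as part of the render loop and UIKit delivers touches on the main run loop, so both should report the main thread; this sketch (not from the question) lets you verify that on a device:

```swift
import SpriteKit

class ThreadCheckScene: SKScene {
    override func update(_ currentTime: TimeInterval) {
        // Called once per frame as part of the render loop.
        print("update on main thread: \(Thread.isMainThread)")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // UIKit delivers touch events on the main run loop.
        print("touchesBegan on main thread: \(Thread.isMainThread)")
    }
}
```

If both print true, the two callbacks cannot run concurrently and no extra synchronisation is needed for an object touched from both.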

Create Button in SpriteKit: Swift

大憨熊 submitted on 2019-12-05 03:59:17
I want to create a button in SpriteKit, or in an SKScene, that sends the view to another view controller. I tried using performSegue(withIdentifier:), but apparently an SKScene doesn't support this. How would I create a button that sends the view to another view with SpriteKit? This is the code that I've tried using to perform this action. The line with "HomeButton.prepareForSegueWithIdentifier()" is just an example. It won't actually let me add the "prepareForSegue" part; it doesn't support it <--- What I mean by that is that when I go to add it, it is unrecognized. class GameOverScene:
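Segues belong to UIKit view controllers, not to SKScene, so one common pattern is to detect the touch on a node in the scene and hand navigation back to the hosting view controller. A sketch, where the node name "homeButton" and the notification name are assumptions:

```swift
import SpriteKit

class GameOverScene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let location = touch.location(in: self)
        // "homeButton" is a hypothetical node name set in the scene editor or in code.
        if atPoint(location).name == "homeButton" {
            // The scene only announces the tap; the view controller performs the segue.
            NotificationCenter.default.post(name: Notification.Name("ShowHome"),
                                            object: nil)
        }
    }
}

// In the hosting view controller (segue identifier "ShowHome" is assumed):
// NotificationCenter.default.addObserver(forName: Notification.Name("ShowHome"),
//                                        object: nil, queue: .main) { [weak self] _ in
//     self?.performSegue(withIdentifier: "ShowHome", sender: nil)
// }
```

A weak delegate reference from scene to view controller works just as well; the point is that the controller, not the scene, owns the segue.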

Memory Leak for .showsPhysics

冷暖自知 submitted on 2019-12-05 03:57:50
I have just spent the past 5 hours trying to debug a memory leak in my SpriteKit app. After app launch, I noticed a small climb in my memory usage. I spent 3 of those 5 hours digging through reference material, learning about strong vs. weak references with ARC (definitely recommend reading up on that for intermediates such as myself). Is anyone else experiencing this issue? If so, is there any sort of explanation? Here is a small snippet of my GameViewController: class GameViewController: UIViewController { override func viewDidLoad() { super.viewDidLoad() if let scene = MainMenu(fileNamed:

SpriteKit's Update Function: time vs. framerate

风格不统一 submitted on 2019-12-05 03:55:12
I'm new to programming and SpriteKit in general, and am interested in exploring the relationship between milliseconds and framerate, and how the update function is used as an intermediary between both. Framerate vs. milliseconds: essentially, the main difference between framerate and time is that time is always consistent, while framerate is not (it could dip due to intensive graphics procedures). However, time is usually checked and set during SKScene's update event (which is called every frame), so I'm trying to figure out how time is correctly calculated when you don't know how many frames
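The usual answer is that you don't assume a frame count at all: you store the timestamp SpriteKit hands to update and subtract it from the next one, so the delta is correct whatever the framerate does. A minimal sketch (the "player" node name and speed are assumptions):

```swift
import SpriteKit

class GameScene: SKScene {
    private var lastUpdateTime: TimeInterval = 0

    override func update(_ currentTime: TimeInterval) {
        // On the very first frame there is no previous timestamp yet.
        let dt = lastUpdateTime > 0 ? currentTime - lastUpdateTime : 0
        lastUpdateTime = currentTime

        // Move 100 points per second, independent of framerate:
        // at 60 fps dt ≈ 1/60, at 30 fps dt ≈ 1/30, so distance per second is the same.
        let speed: CGFloat = 100
        childNode(withName: "player")?.position.x += speed * CGFloat(dt)
    }
}
```

Some games also clamp dt to a maximum (e.g. 1/30 s) so a long pause doesn't produce one huge jump on resume.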

SKVideoNode as texture for SCNSphere

≡放荡痞女 submitted on 2019-12-05 03:45:11
I'm trying to use an SKVideoNode as a video texture source for an SCNSphere within my SCNView. I'm following this answer: SKVideoNode (embedded in SKScene) as texture for Scene Kit Node not working. And with my code (pasted at the end of the question) I do get video and audio playing. The issue is, the mapping only occurs on a quarter of the sphere (the quarter where x and y are both positive). The cameraNode is inside the sphere at (0,0,0) and independent of the sphereNode. I do apply a scale to the sphere node, just to reverse the normals of the texture: sphereNode.scale = SCNVector3Make(-1, 1, 1) but
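A quarter-only mapping is the classic symptom of the video node sitting at the SKScene's default (0,0) origin, so it covers only one quadrant of the scene that becomes the texture. A sketch of the usual fix (scene size and radius are assumptions; videoURL is assumed to exist):

```swift
import SpriteKit
import SceneKit
import AVFoundation

// Host SKScene whose contents become the sphere's diffuse texture.
// 2:1 is a common aspect ratio for equirectangular video; an assumption here.
let videoScene = SKScene(size: CGSize(width: 1024, height: 512))
let player = AVPlayer(url: videoURL)
let videoNode = SKVideoNode(avPlayer: player)

// Center the node and make it fill the scene; leaving it at the
// default (0,0) position maps the video onto only one quarter.
videoNode.position = CGPoint(x: videoScene.size.width / 2,
                             y: videoScene.size.height / 2)
videoNode.size = videoScene.size
videoScene.addChild(videoNode)

let sphere = SCNSphere(radius: 10)
sphere.firstMaterial?.diffuse.contents = videoScene
```

With the node centered and sized to the scene, the whole sphere receives the texture and the scale trick is only needed for flipping, not coverage.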

SpriteKit SKScene missing touchesEnded

China☆狼群 submitted on 2019-12-05 03:36:18
I've noticed that touchesEnded doesn't always get delivered to an SKScene on multi-touch. Depending on the speed of removing fingers etc., I would permanently miss some of the touchesEnded events. touchesCancelled is implemented, and I added a custom UIView and put it over the left side of the screen - no problems. I made a custom SKView and captured events - again no problem. It's obvious that SKScene doesn't get all the touchesEnded events that the SKView it's embedded in does, but why? (BTW, I'm running the SKScene completely without any nodes.) EDIT: Some further investigation reveals I can get SKScene to lose a touch
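Whatever the underlying cause, a defensive pattern is to route touchesEnded and touchesCancelled through the same cleanup path and track active touches explicitly, so a touch that ends via cancellation cannot get stuck. A sketch:

```swift
import SpriteKit

class TouchTrackingScene: SKScene {
    private var activeTouches = Set<UITouch>()

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.formUnion(touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        endTouches(touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Cancelled touches must be cleaned up exactly like ended ones,
        // otherwise the bookkeeping leaks "stuck" touches.
        endTouches(touches)
    }

    private func endTouches(_ touches: Set<UITouch>) {
        activeTouches.subtract(touches)
    }
}
```

This doesn't explain why SKScene drops events the SKView sees, but it makes game state robust to whichever of the two callbacks actually arrives.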

iOS - Different images based on device OR scaling the same image?

半城伤御伤魂 submitted on 2019-12-05 03:34:50
It seems that developers always create different image assets for different devices and load them based on the device. But are there any drawbacks to just creating images for the largest resolution device (iPad) and then scaling that image down for iPhone 6, 5, etc? I use SpriteKit, so I would just create SKSpriteNodes of different sizes and apply the same texture to them and scale it to fit the node. I know that performance can be something to consider (loading large images for older devices). But that aside, is there anything else? Would these images appear more pixelated/blurry? The perfect
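For what the scaling approach looks like in code: the same texture can back nodes of different sizes, and SpriteKit scales it to fit each node. A sketch (the asset name and sizes are assumptions):

```swift
import SpriteKit

// One large asset shared across devices; "hero" is a hypothetical image name.
let texture = SKTexture(imageNamed: "hero")
// .linear smooths downscaling; .nearest keeps hard pixel edges instead.
texture.filteringMode = .linear

// Same texture, two target sizes: SpriteKit scales it to fit each node.
let ipadSprite = SKSpriteNode(texture: texture, size: CGSize(width: 200, height: 200))
let iphoneSprite = SKSpriteNode(texture: texture, size: CGSize(width: 100, height: 100))
```

Downscaling with linear filtering generally looks acceptable; the real costs are the memory of keeping the full-size texture resident and longer load times on older devices.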

iOS10 - can't render Sprite Kit scene within SceneKit with openGL

白昼怎懂夜的黑 submitted on 2019-12-05 03:32:05
Since updating to iOS 10, I'm no longer able to render a Sprite Kit scene into a SceneKit node while using OpenGL for rendering. Things work fine with Metal. The error logged is: "Failed to create IOSurface image (texture)". I used to be able to do something like: class ViewController: UIViewController { @IBOutlet weak var scnView: SCNView! override func viewDidLoad() { super.viewDidLoad() // Do any additional setup after loading the view, typically from a nib. scnView.showsStatistics = true scnView.allowsCameraControl = true let scnScene = SCNScene() scnView.scene = scnScene print("scnView
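If switching to Metal is acceptable, the rendering API can be requested explicitly when constructing the SCNView in code (for a storyboard-created view like the outlet above, the equivalent setting lives in Interface Builder). A sketch:

```swift
import SceneKit

// Request Metal explicitly when creating the view programmatically.
let options: [String: Any] = [
    SCNView.Option.preferredRenderingAPI.rawValue: SCNRenderingAPI.metal.rawValue
]
let scnView = SCNView(frame: .zero, options: options)
scnView.scene = SCNScene()

// renderingAPI reports which backend SceneKit actually chose.
print("rendering API: \(scnView.renderingAPI == .metal ? "Metal" : "OpenGL ES")")
```

Note this is a preference, not a guarantee: SceneKit may still fall back (e.g. in the simulator), which is why checking renderingAPI afterwards is worthwhile.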

SpriteKit: performance hit while preloading SKTextureAtlas

醉酒当歌 submitted on 2019-12-05 03:02:21
I'm experiencing a performance hit when preloading an SKTextureAtlas: let textureAtlas = SKTextureAtlas(named: atlasName) textureAtlas.preload(completionHandler: { ... }) By performance hit, I mean FPS dropping to ~50 for short amounts of time. I tested it with the Time Profiler in Instruments and verified that this work is indeed being done on a worker thread, as stated in the documentation. The image below shows a Time Profiler capture of the spike caused by preloading the atlas. As you can see, most of the spike is caused by 2 worker threads, which all seem to be loading image data, as far as I
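Since the decoding work competes for CPU even on a worker thread, one mitigation is to pay the cost up front, behind a loading screen, using the batch preload API instead of preloading mid-gameplay. A sketch (atlas names and the presentation helper are assumptions):

```swift
import SpriteKit

// "Player" and "Enemies" are hypothetical atlas names.
let atlases = [SKTextureAtlas(named: "Player"), SKTextureAtlas(named: "Enemies")]

// Preload everything while a loading scene is on screen, so the
// worker-thread decoding never overlaps with gameplay frames.
SKTextureAtlas.preloadTextureAtlases(atlases) {
    DispatchQueue.main.async {
        // Present the gameplay scene only once textures are resident.
        // presentGameScene() is a hypothetical helper in the view controller.
        // presentGameScene()
    }
}
```

A dropped frame during a static loading screen is invisible; the same spike during gameplay is not, which is the whole point of front-loading it.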

ARKit / SpriteKit - set pixelBufferAttributes to SKVideoNode or make transparent pixels in video (chroma-key effect) another way

末鹿安然 submitted on 2019-12-05 02:59:53
My goal is to present 2D animated characters in the real environment using ARKit. The animated characters are part of a video, as presented in the following snapshot from the video: Displaying the video itself was achieved with no problem at all using the code: func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? { guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil } let url = URL(fileURLWithPath: urlString) let asset = AVAsset(url: url) let item = AVPlayerItem(asset: asset) let player = AVPlayer(playerItem: item) let videoNode =
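One approach worth sketching (not a confirmed solution for this question) is to wrap the SKVideoNode in an SKEffectNode and attach a CIColorCube filter whose lookup table zeroes the alpha of pixels in the keyed hue range. The hue bounds below are assumptions to tune per video:

```swift
import SpriteKit
import CoreImage
import UIKit

// Build a CIColorCube lookup table that makes hues in [fromHue, toHue]
// (0...1, where green is roughly 0.3...0.4) fully transparent.
func chromaKeyFilter(fromHue: CGFloat, toHue: CGFloat) -> CIFilter? {
    let size = 64
    var cube = [Float]()
    cube.reserveCapacity(size * size * size * 4)
    for z in 0..<size {
        let blue = CGFloat(z) / CGFloat(size - 1)
        for y in 0..<size {
            let green = CGFloat(y) / CGFloat(size - 1)
            for x in 0..<size {
                let red = CGFloat(x) / CGFloat(size - 1)
                var hue: CGFloat = 0, sat: CGFloat = 0, bri: CGFloat = 0, a: CGFloat = 0
                UIColor(red: red, green: green, blue: blue, alpha: 1)
                    .getHue(&hue, saturation: &sat, brightness: &bri, alpha: &a)
                let alpha: Float = (hue >= fromHue && hue <= toHue) ? 0 : 1
                // Premultiplied RGBA, as CIColorCube expects.
                cube += [Float(red) * alpha, Float(green) * alpha,
                         Float(blue) * alpha, alpha]
            }
        }
    }
    let data = cube.withUnsafeBufferPointer { Data(buffer: $0) }
    return CIFilter(name: "CIColorCube",
                    parameters: ["inputCubeDimension": size,
                                 "inputCubeData": data])
}

// Usage inside nodeFor: wrap the video node so the filter applies to it.
// let effectNode = SKEffectNode()
// effectNode.filter = chromaKeyFilter(fromHue: 0.3, toHue: 0.4) // green-screen range
// effectNode.addChild(videoNode)
// return effectNode
```

The filter runs per frame, so on AR workloads it is worth profiling; baking transparency into the asset (e.g. HEVC with alpha, where available) avoids the runtime cost entirely.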