Advancements in Game Controllers
Let's rumble! Discover how you can bring third-party game controllers and custom haptics into your games on iPhone, iPad, Mac, and Apple TV. We'll show you how to add support for the latest controllers — including Xbox's Elite Wireless Controller Series 2 and Adaptive Controller — and map your game's controls accordingly. Learn how you can use the Game Controller framework in tandem with Core Haptics to enable rumble feedback. And find out how you can take your gaming experience to the next level with custom button mapping, nonstandard inputs, and control over specialty features like motion sensors, lights, and battery level. To get the most out of this session, you should be familiar with the Game Controller framework. Check the documentation link for a primer. And if you build games for iPad, be sure to check out "Bring keyboard and mouse gaming to iPad” for a guide on integrating keyboard, mouse, and trackpad inputs into your experience.
Resources
Related Videos
WWDC21
- Qualities of great iPad and iPhone apps on Macs with M1
- Tap into virtual and physical game controllers
Download
♪ Hello and welcome to WWDC.
James Kelly: Hello. My name is James Kelly, and I'm a member of the Game Technologies Engineering team at Apple.
In this video, we'll be talking about new game controller features and changes coming to our platforms this year, and we'll give some advice on how you can best adopt these new changes.
As a reminder, the Game Controller framework's goal is to make it easy to add support for MFi and select third-party game controllers to your games on iOS, tvOS, and macOS.
By abstracting controller hardware through a common API, the game controller framework lets you write your code once without you having to worry about how that controller data is mapped.
This year we're making sure you can take advantage of, and customize, the great features of game controllers like haptics and rumble, motion, lights, as well as unique inputs like touchpads and paddles.
We'll also show how users can remap their controllers globally or tune it to their game to get just the configuration they want.
Before we start, we're excited to announce some additions to our supported lineup of great controllers.
First up, we've added support for the Xbox Elite Wireless Controller Series 2.
It's got a really great feel and it's been highly requested by our users.
We're also really excited to announce support for the Xbox Adaptive Controller.
This unique extensible controller helps to make gaming more accessible for all.
So developers are creating some great games with controller support.
Oceanhorn 2, Sonic Racing, Spyder, and Lego Brawls are some excellent examples on Apple Arcade and just a few of the games on the App Store that support controllers.
By using the game controller framework, these games automatically support newly-added controllers like the Xbox Adaptive Controller or the Xbox Elite Wireless Controller Series 2.
So now let's talk about changes coming to the game controller framework this year in support of these new controllers.
So let's start by talking about how you can access the buttons of these new controllers.
There's a new way to access controller input this year: GCPhysicalInputProfile, which represents the collection of all physical inputs available on a controller -- meaning its buttons, triggers, D-pads, thumbsticks, and so on.
We're calling this the new Extensible Input API as it allows your game to dynamically query and support all of the controller's inputs at runtime -- even non-standard inputs like the DualShock 4's touchpad or the Xbox Elite Controller's paddle buttons.
Every controller now has a physical input profile.
And so GCExtendedGamepad and GCMicroGamepad have both been made subclasses of GCPhysicalInputProfile.
You're still encouraged to check for the presence of profiles like GCExtendedGamepad to filter out controllers that don't have the buttons you need.
You can then use the PhysicalInputProfile to access the controller's unique buttons like the DualShock 4's touchpad.
So let's take a look at a code example to get a sense of these changes.
Let's say we want to add some special shortcuts for controllers with extra buttons.
First, we'll add an action that can instantly trigger an attack combo which would normally require a combination of button presses.
We'll also track the buttons on the controller that we have and have not mapped to actions.
Now let's look at setupConnectedController that is called when a controller connects.
We grab a reference to the physical input profile on the controller.
Remember that every controller will have this profile.
We have a helper function, setupBasicControls, that will set up all the standard controls for the face buttons, the left and right thumbsticks, the D-pad, et cetera, and it will also update our mappedButtons set.
Using that physical input profile, we can then easily grab a reference to a paddle button, if one exists, and update our mapped button set.
Finally, we can look at all of the buttons available on the physical input profile and filter out any buttons that have already been mapped.
We can use this to populate an in-game mapping UI with a list of unused buttons.
So for ease of use, we've added two new subclasses of GCExtendedGamepad: GCDualShockGamepad and GCXboxGamepad.
The DualShockGamepad exposes the touchpad surface with two-finger tracking as well as the touchpad's button.
The XboxGamepad exposes the four paddle buttons of the Xbox Elite Wireless Controller.
Use extra buttons to augment your games with additional convenience controls, but avoid requiring these buttons as other controllers don't have them.
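Here's a minimal sketch of how you might opt into these subclasses; triggerAttackCombo and showMap are hypothetical game actions, and the extra inputs are only wired up when the connected controller actually has them.

import GameController

// Hypothetical game actions used only for illustration.
func triggerAttackCombo() { /* ... */ }
func showMap() { /* ... */ }

func setupOptionalControls(for controller: GCController) {
    // Xbox Elite paddles: map paddle 1 to a convenience shortcut, if the controller has paddles.
    if let xboxGamepad = controller.extendedGamepad as? GCXboxGamepad {
        xboxGamepad.paddleButton1?.pressedChangedHandler = { _, _, pressed in
            if pressed { triggerAttackCombo() }
        }
    }

    // DualShock 4 touchpad button: open the in-game map.
    if let dualShockGamepad = controller.extendedGamepad as? GCDualShockGamepad {
        dualShockGamepad.touchpadButton.pressedChangedHandler = { _, _, pressed in
            if pressed { showMap() }
        }
    }
}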
So one quick note about the paddle buttons on the Xbox Elite Wireless Controller Series 2.
The controller supports hardware button remapping, which can be configured and then stored in one of three profiles.
The selected profile is indicated by the three LEDs on the front of the controller.
And due to how the controller sends data, the paddle buttons will only work when the controller does not have a button mapping profile selected, meaning that the three LEDs are unlit.
So up next, let's talk about haptics and rumble, one of the most exciting additions to the API this year.
Games coordinate visuals, sounds, and haptics from the controller to more deeply immerse the player.
So great haptics can help give a sense that your car is driving over gravel, that your character is walking through sand, that you feel the recoil in your trigger finger, or even that a nearby explosion is shaking your room.
Many of our supported controllers can provide haptic feedback, including the DualShock 4, the Xbox Wireless Controller, and the Xbox Elite Wireless Controller.
And that feedback is programmable, meaning that you have fine-grained control over the low-level haptic feeling that is played.
These controllers all use different hardware for generating haptics.
So let's talk a little bit about how you can easily add haptic feedback to your game without having to worry about the physical details of each controller.
So this year we're happy to add game controller support to Core Haptics, on top of the existing phone haptic support.
Core Haptics is a powerful API that allows you to compose and play haptic patterns to customize your app's haptic feedback.
In Core Haptics, you can craft custom haptic patterns from basic building blocks called haptic events.
These patterns can be created programmatically or stored in Apple haptic and audio pattern, or AHAP, files.
After designing your haptic patterns, you simply create pattern players from a haptic engine to play your haptic content.
Core Haptics and AHAP allows you to design your haptic content once and then play it everywhere, be it a phone, a DualShock 4, or any one of our other supported devices.
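As a minimal sketch of that workflow, assuming an AHAP file named "EnemyStrike" is bundled with the app (a hypothetical resource), you could play it on a connected controller like this:

import GameController
import CoreHaptics

func playAHAPPattern(named name: String, on controller: GCController) {
    // "EnemyStrike" is a hypothetical AHAP resource bundled with the app.
    guard let url = Bundle.main.url(forResource: name, withExtension: "ahap"),
          let engine = controller.haptics?.createEngine(withLocality: .default) else { return }
    do {
        try engine.start()
        // Plays the entire AHAP pattern on the engine's target locality.
        try engine.playPattern(from: url)
    } catch {
        print("Failed to play AHAP pattern: \(error)")
    }
}

Calling playAHAPPattern(named: "EnemyStrike", on: controller) would then play the same content you designed once for all of your supported devices.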
For more information on the details of the Core Haptics framework, including some excellent examples on how to design AHAP content, please watch Introducing Core Haptics from WWDC 2019.
Let's look at a simple practical example.
Haptic feedback is a great way to immerse players in a game and let them really feel the impact of their gameplay.
So let's say we have a player and an enemy in an action game.
When the enemy attacks the player, they're dealt damage and the controller rumbles.
So how do we achieve this in our game using the game controller and Core Haptics frameworks? Well first, we need to design our haptic content.
Here we've programmatically created a haptic pattern called enemy strike that will play when an enemy attacks.
We can then feed this data into a CHHapticPattern.
Once we have our haptic content designed and loaded into a pattern, we can get ready to play it.
But how do we play it on a controller? Well first we need a GCController instance.
Here representing the Xbox Elite Controller shown on the right.
Next, we can create a CHHapticEngine that targets the locations on a controller that we'd like to play haptics on.
A haptic engine is just an object that manages your app's requests to play haptic patterns.
Finally, we can create a CHHapticPatternPlayer from our engine and our pattern.
Once we call start on this pattern player, our AHAP content will play on our controller.
In this case, just the handles.
You can even layer these pattern players.
The system will automatically combine the haptics from all active patterns for you at runtime, allowing you to create some really dynamic effects.
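For instance, here's a minimal sketch of layering, assuming you already have a CHHapticEngine and two CHHapticPatterns of your own (say, a low rumble and a sharp impact):

import CoreHaptics

// engine, rumblePattern, and impactPattern are assumed to come from your own setup code.
func playLayeredHaptics(engine: CHHapticEngine,
                        rumblePattern: CHHapticPattern,
                        impactPattern: CHHapticPattern) throws {
    let rumblePlayer = try engine.makePlayer(with: rumblePattern)
    let impactPlayer = try engine.makePlayer(with: impactPattern)

    // Both players run on the same engine; the system mixes their output at runtime.
    try rumblePlayer.start(atTime: CHHapticTimeImmediate)
    try impactPlayer.start(atTime: CHHapticTimeImmediate)
}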
So how do you create a haptic engine? Simply call createEngine(withLocality:) on a controller's haptics property and pass in a GCHapticsLocality.
This locality defines the physical locations on the controller that the haptic engine will target.
The default haptic locality is geared towards providing a haptic experience that your users would expect.
For example, on game controllers, the default locality typically targets the handles.
You can get more specific if you'd like.
Here we're targeting just the left impulse trigger.
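As a minimal sketch, assuming activeController is a connected GCController, an engine scoped to the left trigger would look like this:

import GameController
import CoreHaptics

// activeController is assumed to be a connected GCController.
func makeLeftTriggerEngine(for activeController: GCController) -> CHHapticEngine? {
    // .leftTrigger limits playback to the left impulse trigger's actuator, on controllers that have one.
    return activeController.haptics?.createEngine(withLocality: .leftTrigger)
}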
So how do we put all this together? Let's look at a code example.
We begin by creating an instance of a CHHapticEngine, one that targets the handles of the active controller.
The engine is assigned to our member variable so we can keep it around.
After creating the haptic engine, we'll need to start it and check for possible errors.
The engine will continue to run until the application, or some outside action, stops it.
So let's play some haptics.
Here's the function that executes when the player is dealt damage by the enemy.
In this example, you want to generate a haptic pattern that scales with the amount of damage that was dealt.
So we have to create the pattern player and its pattern at the moment they're needed to account for this damage.
Once the pattern player is created, we start it at time CHHapticTimeImmediate, which indicates that we want to play it with minimal latency.
If you had two separate haptic players that you wanted to synchronize, you would need to specify an actual time here.
Notice that the app does not hold onto the instance of the player.
Its pattern is guaranteed to continue playing until it's finished, so we can simply fire and forget it.
Let's look at creating a pattern player.
This method is responsible for creating the custom haptic content that will play when we take damage.
We'll create a continuous haptic event that applies over a specified duration.
We'll give it a sharpness and intensity.
This will provide a nice baseline of haptic feedback, regardless of how much damage was done to the player.
Next we create a transient haptic event.
These are brief impulses that occur at a specific point in time such as the click you would feel from toggling a switch.
Note that the transient events don't have a specified duration.
They're just the shortest haptic event that you can produce, so the actual time a transient takes will vary from controller to controller.
Again, we set the sharpness and intensity.
This time we scale the intensity by the amount of damage that was dealt to the player.
Next, we create the pattern containing these two events.
Finally, we create the pattern player from this pattern and return it.
OK. So now that you've seen how easy it is to generate haptic content with Core Haptics, let's talk about how you can transition your existing rumble code from other platforms.
In many existing games, haptics are tied directly to the update loop.
This typically means that every update, the engine will set the intensity of each physical actuator on a game controller directly.
With this approach, the engine combines different haptic effects each frame.
Let's talk about how you can migrate a game that uses this architecture to Core Haptics.
Our goal is to update the intensities of the controller's motors each frame.
To achieve this, we'll need to create a pattern player so we can send changes to it and then we'll create a long-running pattern with an initial haptic intensity set to one.
In our update loop, we'll just change the intensity of the motors.
To do this, we create a dynamic parameter with the haptic intensity we calculate for a given motor and then we send it to the pattern player.
We can do this each frame, and as long as our updates are fast enough, the user will feel a continuous change in haptic intensity.
Let's take a quick look at how we can achieve this in code.
Here we're setting up all the pieces you need to be able to update your controller's motors in your update loop.
We're setting up our pattern player, which we'll later send new intensities to on each update.
In this function, we'll create a long-running, continuous haptic event with a haptic intensity set to the max value of one.
We'll use the continuous event to create a haptic pattern.
And finally, we'll create our pattern player and start it immediately.
You'll need to hold onto this, as you'll be using it later.
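Here's a minimal sketch of that setup; the hapticsUpdateLoopPatternPlayer name matches the update-loop sample later in this session, and the 30-second duration is just an assumption for "long-running".

import CoreHaptics

var hapticsUpdateLoopPatternPlayer: CHHapticPatternPlayer?

func setupUpdateLoopHaptics(engine: CHHapticEngine) throws {
    // A long-running continuous event with its haptic intensity at the maximum value of 1.
    // The duration only needs to outlast the gameplay segment; 30 seconds is an assumption here.
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous,
                                        parameters: [
                                            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0)
                                        ],
                                        relativeTime: 0,
                                        duration: 30.0)

    // Wrap the event in a pattern, make a player, hold onto it, and start it immediately.
    let pattern = try CHHapticPattern(events: [continuousEvent], parameters: [])
    hapticsUpdateLoopPatternPlayer = try engine.makePlayer(with: pattern)
    try hapticsUpdateLoopPatternPlayer?.start(atTime: CHHapticTimeImmediate)
}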
Now in our update loop, we're also going to update our haptics.
This is where we update the intensity of our motors.
We create a dynamic parameter targeting haptic intensity.
These are similar to the parameters used to generate events, except they can be applied dynamically to an already-running pattern, and they apply to the entire pattern, not just a single event.
Here, hapticEngineMotorIntensity represents the intensity of the motor, from 0 to 1, as calculated by your game engine for this frame.
This will act as a multiplier on the pattern player's intensity.
We finish up by sending our dynamic intensity parameter to our pattern player.
Please note that this example only covers one CHHapticEngine.
You will need to repeat this approach for each motor if you would like to control them independently from one another.
Next up, let's talk about some other features that we've added to the Game Controller framework this year that'll help you to fully utilize the extended capabilities of newly-supported controllers.
Let's start by looking at a situation that single-player games frequently need to handle.
Your game launches, and you see multiple controllers connected.
Particularly with the Apple TV, it's common to see the Siri Remote and a game controller -- but which one should you take input from? How do you know which controller is currently being used by the player? You could register your controller connection and disconnection handlers and manage the lifecycles of the controllers, but you still need to track when a controller becomes active to update the UI and controls as the user switches between input devices.
This is why we're introducing some new API to handle this for you.
We've added the current property to GCController which always returns the most recently-used controller or nil if no controller is connected.
We've also added two new notifications that you can observe to properly respond to the current controller changing.
GCControllerDidBecomeCurrent and GCControllerDidStopBeingCurrent.
Single-player games should adapt to the currently active game controller by using this property.
By listening to the notifications, you can easily adapt your UI to reflect the currently used controller.
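A minimal sketch of wiring this up; updateUI(for:) is a hypothetical hook where you would refresh glyphs and control hints.

import GameController

// Hypothetical hook where you would refresh on-screen glyphs and control hints.
func updateUI(for controller: GCController?) { /* ... */ }

var controllerObservers = [NSObjectProtocol]()

func observeCurrentController() {
    let center = NotificationCenter.default

    controllerObservers.append(center.addObserver(forName: .GCControllerDidBecomeCurrent,
                                                  object: nil,
                                                  queue: .main) { note in
        // The controller the player just started using.
        updateUI(for: note.object as? GCController)
    })

    controllerObservers.append(center.addObserver(forName: .GCControllerDidStopBeingCurrent,
                                                  object: nil,
                                                  queue: .main) { _ in
        // Fall back to whatever GCController.current reports now (possibly nil).
        updateUI(for: GCController.current)
    })
}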
Next, let's talk about the motion sensors on the DualShock 4.
The DualShock 4 has a gyroscope, meaning that it knows its rotation rate in 3D space allowing for some great gameplay possibilities.
One common technique is to use the gyroscope for fine-tuned camera aiming with the thumbsticks used for more coarse aiming.
The DualShock 4 also has an accelerometer.
There are many ways to use the accelerometer.
For example, you can detect whether the controller is tilted to the left or the right for driving controls in a racing game or you can interpret high, rapid shifts in acceleration as a shake gesture to make your player character attack.
We've introduced support for the DualShock 4's motion sensors via GCMotion, a property on GCController.
For those of you familiar with supporting the Siri Remote on tvOS, we're using the same API here but with a few tweaks.
When supporting the DualShock 4, you need to manually turn the motion sensors on and off to preserve battery.
You can query whether you need to manage the sensors yourself or leave it up to the system like with the Siri Remote.
Set motion.sensorsActive to toggle motion sensors on and off.
Additionally, some controllers like the DualShock 4 do not separate gravity from user acceleration and only report the total acceleration of the controller.
Query whether the connected controller has the ability to separate gravity from user acceleration, and if it doesn't, use the total acceleration instead.
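Putting those pieces together, here's a minimal sketch; aimCamera(with:) is a hypothetical game function, and the manual-activation check mirrors the sample code later in this session.

import GameController

// Hypothetical game hook; map the rotation rate axes to your camera however fits your game.
func aimCamera(with rate: GCRotationRate) { /* ... */ }

func setupMotion(for controller: GCController) {
    guard let motion = controller.motion else { return }

    // Controllers like the DualShock 4 need their sensors switched on explicitly.
    if motion.sensorsRequireManualActivation {
        motion.sensorsActive = true
    }

    motion.valueChangedHandler = { motion in
        // Use the gyroscope's rotation rate for fine-grained camera aiming.
        aimCamera(with: motion.rotationRate)
    }
}

Remember to set motion.sensorsActive back to false when the player leaves gyro-driven gameplay, so you don't drain the controller's battery.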
The next feature I want to tell you about is the DualShock 4's lightbar.
It's usually used as a player indicator but you can also use it for game effects.
For example, it could flash a different color when the player walks into lava or shine green, then yellow, then red as the player loses health.
We've surfaced support for the DualShock 4's lightbar via GCDeviceLight, a property on GCController.
It's easy to change the color of a DualShock 4 lightbar.
Just set the GCDeviceLight color value.
Here, we're setting the color to red.
Nowadays, many controllers are wireless and may need charging.
We're making battery state available to you so that you can show it in your UI, particularly in low battery situations.
This is provided via GCDeviceBattery, a property on GCController.
You can check the level of the battery as well as its charging state, whether it's charging, discharging, or fully charged.
You can then use key value observing (or KVO) to monitor any changes to the battery.
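A minimal sketch of reading and observing battery state; showBatteryIndicator is a hypothetical UI hook, and the KVO observation assumes the batteryLevel property is key-value observable, as described above.

import GameController

// Hypothetical UI hook.
func showBatteryIndicator(level: Float, state: GCDeviceBattery.State) { /* ... */ }

var batteryObservation: NSKeyValueObservation?

func setupBatteryMonitoring(for controller: GCController) {
    guard let battery = controller.battery else { return }

    // Show the current level (0.0 to 1.0) and charging state right away.
    showBatteryIndicator(level: battery.batteryLevel, state: battery.batteryState)

    // Keep the UI in sync as the battery level changes.
    batteryObservation = battery.observe(\.batteryLevel, options: [.new]) { battery, _ in
        showBatteryIndicator(level: battery.batteryLevel, state: battery.batteryState)
    }
}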
Now I'd like to introduce my colleague Hannah Gillis, who will give some tips on how to best take advantage of our new input remapping feature.
Hannah Gillis: Thanks, James.
Let's talk about how you can indicate that your app supports game controllers and the benefits that come along when you do so.
There are a few reasons to indicate controller support.
First, it surfaces information on the App Store and on Apple TV to communicate to users when a controller is required for an application.
A game controller badge will be listed on the app's product page to visually declare this info.
Second, indicating controller support provides the ability for input remapping in your application, which I'll discuss shortly.
For these reasons, we highly recommend you update your apps for the best user experience.
To indicate controller support, you should enable the game controllers capability in Xcode.
This informs the system that your app uses game controller features.
Once enabled, you can take advantage of many associated benefits.
As mentioned, one of these benefits is game controller input remapping.
And I'm excited to tell you about this feature in both iOS and tvOS.
Have you ever played a game and wished the buttons were configured differently? For example, maybe you're in an open world.
You have a controller in your hand and want your character to look up.
What direction do you push the joystick? Well, depending on your preference, you might actually want to invert the Y-axis if it better matches your intuition.
This is called input remapping and it's incredibly powerful for allowing users to customize their gaming experience.
Additionally, it empowers our accessibility community to enjoy gaming the way they want.
In iOS 14 and tvOS 14, we have created a method to support input remapping, even if it's not built directly into your app.
There are two ways you can take advantage of input remapping.
The first is globally, meaning any button customizations will apply to all game controller apps on your device.
The second is per application.
It allows you to specify different remapping for an individual app.
Both of these options are available if you've indicated game controller support in Xcode.
Let's take a look at how a user will experience input remapping.
In iOS 14, start by connecting one of our supported game controllers via Bluetooth.
Go to Settings and tap General, where you will now see Game Controller.
Tap Game Controller to see your currently connected device, then tap Customizations to choose the button you want to customize.
I want to remap the R1 button so I'll select that.
Let's remap this to the L1 button.
Now if you go back, you can see the R1 button is remapped to the L1 button.
Remember our joystick example? You can invert the joystick axis here as well.
Simply click Left Stick and then toggle Invert Vertically.
Now let's say you want to add custom remapping for just one specific app.
You can do this by selecting Add App and choosing one of your game controller supported apps to remap as needed.
These customizations will only apply to the app you have selected rather than all game controller apps you might have.
We have input remapping available in tvOS 14 as well.
It can be found under Settings > Remotes and Devices > Bluetooth.
Then simply select your connected controller for customization.
So now that you know how global and per app input remapping work for the user, how can you adopt this in your game? To support global remapping, you simply need to use the game controller framework we reviewed earlier.
This ensures the benefit of input remapping across all apps on the device.
For per application remapping, you should additionally check the box called Extended Gamepad in the Game Controller section of Xcode.
This allows users to apply unique customizations to your app specifically.
If a user has applied input remapping, you can make sure your in-game UI and tutorials reflect this, along with other controller scenarios your player might encounter.
To enable this, we have added game controller input glyphs to SF Symbols in iOS 14.
You no longer need to generate art assets on your own, and Apple keeps this library of glyphs accurate for all controllers that are supported today and in the future.
Like all SF Symbols, you can choose different weights for your game controller glyphs.
And to match the look and feel of your game, you can choose Light and Dark Mode glyphs as well as tinted colors to go with your design.
To learn more, check out Introducing SF Symbols from WWDC 2019.
To find the correct glyph that matches the player's input, you can look at the sfSymbolsName property of the GCControllerElement.
This works for the button, axis, joystick, trigger, or shoulder button.
Here, if I want to display the corresponding glyph for the classic Y button, it will return the Y in a circle glyph if the input device is an MFi or Xbox controller, or the triangle in a circle glyph if the input device is a DualShock 4.
And of course we made sure that the UI accurately reflects any global or per application button remapping your player has chosen to do.
For example, say the player remapped the Y button to the X button.
When you reference the Y button in your game, it will display the X in a circle glyph in the UI.
Your game and your UI are easily responsive to input remapping decisions like these.
And that wraps our section on input remapping and UI glyphs.
Now I'll pass it back to James.
James: Thanks, Hannah.
OK, so let's summarize what we've discussed in this video.
We talked about how the extensible input API can keep your controller support flexible to any new controllers that may be added.
We went over game controller and Core Haptics integration that will allow you to create rich haptic and rumble experiences on supported controllers.
We dove into how you can support some of the more unique features of controllers like motion, lightbar, and battery.
We also went over the new input remapping available on iOS and tvOS and what that means for your game.
We talked about new controller button glyphs that have been added to SF Symbols and how you can use them to create a polished and adaptive UI.
And finally, we showed how updating your app's capabilities in Xcode to indicate extended gamepad support will badge your game in the App Store, allow per app input remapping, and futureproof your game for any new features that rely on this capability.
For more information about this year's game controller update, please visit the Developer site for the game controller framework.
If you're interested in learning about adding keyboard and mouse support to your game, watch this year's talk on Bring Keyboard and Mouse Gaming to iPadOS.
That's it for this year's game controllers update.
Thank you for watching.
We hope you found this information helpful and that you use it to create some awesome games with game controller support.
-
2:53 - Extensible input API
// Extensible input API example

var attackComboBtn: GCControllerButtonInput?
var mapBtn: GCControllerButtonInput?

var mappedButtons = Set<GCControllerButtonInput>()
var unmappedButtons = Set<GCControllerButtonInput>()

func setupConnectedController(_ controller: GCController) {
    let input = controller.physicalInputProfile

    // Set up standard button mapping
    setupBasicControls(input)

    // Map a shortcut to the player's special combo attack
    attackComboBtn = input.buttons["Paddle 1"]
    if (attackComboBtn != nil) {
        mappedButtons.insert(attackComboBtn!)
    }

    // Map a shortcut to the in-game map
    mapBtn = input.buttons[GCInputDualShockTouchpadButton]
    if (mapBtn != nil) {
        mappedButtons.insert(mapBtn!)
    }

    // Find buttons that haven't been mapped to any actions yet
    unmappedButtons = input.allButtons.filter { !mappedButtons.contains($0) }
}
-
8:45 - Starting the Haptic Engine
private func createAndStartHapticEngine() {
    // Create and configure a haptic engine for the active controller
    guard let controller = activeController else { return }
    hapticEngine = controller.haptics?.createEngine(withLocality: .handles)
    guard let engine = hapticEngine else {
        print("Active controller does not support handle haptics")
        return
    }

    // Start the engine and check for possible errors.
    do {
        try engine.start()
    } catch let error {
        print("Haptic Engine Start Error: \(error)")
    }
}
-
9:05 - Play haptics
// Play haptics whenever the player is damaged
private func playerWasDamaged(damage: Float) {
    do {
        // Calculate the magnitude of damage as percentage of range [0, maxPossibleDamage]
        let damageMagnitude: Float = ...

        // Create a haptic pattern player for the player being hit by an enemy
        let hapticPlayer = try patternPlayerForPlayerDamage(damageMagnitude)

        // Start player, "fire and forget".
        try hapticPlayer?.start(atTime: CHHapticTimeImmediate)
    } catch let error {
        print("Haptic Playback Error: \(error)")
    }
}
-
9:49 - Creating a haptic pattern
// Create a haptic pattern that scales to the damage dealt to the player
private func patternPlayerForPlayerDamage(_ damage: Float) throws -> CHHapticPatternPlayer? {
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous,
                                        parameters: [
                                            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
                                            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.3),
                                        ],
                                        relativeTime: 0,
                                        duration: 0.6)

    let firstTransientEvent = CHHapticEvent(eventType: .hapticTransient,
                                            parameters: [
                                                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
                                                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.9 * damage),
                                            ],
                                            relativeTime: 0.2)

    let secondTransientEvent = CHHapticEvent(eventType: .hapticTransient,
                                             parameters: [
                                                 CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
                                                 CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.9 * damage),
                                             ],
                                             relativeTime: 0.4)

    let pattern = try CHHapticPattern(events: [continuousEvent, firstTransientEvent, secondTransientEvent],
                                      parameters: [])

    return try engine.makePlayer(with: pattern)
}
-
12:28 - Updating haptics every frame
// Update the state of the connected controller's haptics every frame
private func update() {
    updateHaptics()
}

private func updateHaptics() {
    // Update the controller's haptics by sending a dynamic intensity parameter each frame
    do {
        // Create dynamic parameter for the intensity.
        let intensityParam = CHHapticDynamicParameter(parameterID: .hapticIntensityControl,
                                                      value: hapticEngineMotorIntensity,
                                                      relativeTime: 0)

        // Send parameter to the pattern player.
        try hapticsUpdateLoopPatternPlayer?.sendParameters([intensityParam],
                                                           atTime: CHHapticTimeImmediate)
    } catch let error {
        print("Dynamic Parameter Error: \(error)")
    }
}
-
14:11 - Current controller notifications
NSNotification.Name.GCControllerDidBecomeCurrent
NSNotification.Name.GCControllerDidStopBeingCurrent
-
15:25 - Manual activation
if motion.sensorsRequireManualActivation {
    motion.sensorsActive = true
}
-
15:44 - Using total acceleration
if motion.hasGravityAndUserAcceleration {
    handleMotion(gravity: motion.gravity, userAccel: motion.userAcceleration)
} else {
    handleMotion(totalAccel: motion.acceleration)
}
-
16:27 - Setting up the lightbar
guard let controller = GCController.current else { return }

controller.light?.color = GCColor(red: 1.0, green: 0, blue: 0)
-
22:36 - Input glyphs with SF Symbols
let xboxButtonY = xboxController.physicalInputProfile[GCInputButtonY]!
guard let xboxSfSymbolsName = xboxButtonY.sfSymbolsName else { return }
let xboxButtonYGlyph = UIImage(systemName: xboxSfSymbolsName)

let ds4ButtonY = ds4Controller.physicalInputProfile[GCInputButtonY]!
guard let ds4SfSymbolsName = ds4ButtonY.sfSymbolsName else { return }
let ds4ButtonYGlyph = UIImage(systemName: ds4SfSymbolsName)
-