/*
 * Copyright LWJGL. All rights reserved.
 * License terms: https://www.lwjgl.org/license
 * MACHINE GENERATED FILE, DO NOT EDIT
 */
package org.lwjgl.vulkan;

import javax.annotation.*;

import java.nio.*;

import org.lwjgl.*;
import org.lwjgl.system.*;

import static org.lwjgl.system.MemoryUtil.*;
import static org.lwjgl.system.MemoryStack.*;

/**
 * Structure specifying a buffer memory barrier.

Description

This structure defines a memory dependency limited to a range of a buffer, and can define a queue family transfer operation for that range.


The first synchronization scope and access scope described by this structure include only operations and memory accesses specified by {@code srcStageMask} and {@code srcAccessMask}.


The second synchronization scope and access scope described by this structure include only operations and memory accesses specified by {@code dstStageMask} and {@code dstAccessMask}.


Both access scopes are limited to only memory accesses to {@code buffer} in the range defined by {@code offset} and {@code size}.
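As an illustration of these scopes, here is a minimal LWJGL sketch that makes transfer writes to a buffer range available to fragment shader reads. The {@code cmd} command buffer and {@code storageBuffer} handle are assumptions from the surrounding application, and the synchronization2 feature is assumed to be enabled.

    // Minimal sketch: `cmd` (a VkCommandBuffer in the recording state) and `storageBuffer`
    // (a VkBuffer handle) are assumed to exist in the application.
    try (MemoryStack stack = MemoryStack.stackPush()) {
        VkBufferMemoryBarrier2KHR.Buffer barrier = VkBufferMemoryBarrier2KHR.calloc(1, stack);
        barrier.get(0)
            .sType$Default()
            // first synchronization/access scope: the copy that wrote the buffer
            .srcStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_COPY_BIT_KHR)
            .srcAccessMask(KHRSynchronization2.VK_ACCESS_2_TRANSFER_WRITE_BIT_KHR)
            // second synchronization/access scope: fragment shader reads
            .dstStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_FRAGMENT_SHADER_BIT_KHR)
            .dstAccessMask(KHRSynchronization2.VK_ACCESS_2_SHADER_READ_BIT_KHR)
            // equal indices: no queue family ownership transfer
            .srcQueueFamilyIndex(VK10.VK_QUEUE_FAMILY_IGNORED)
            .dstQueueFamilyIndex(VK10.VK_QUEUE_FAMILY_IGNORED)
            .buffer(storageBuffer)
            .offset(0)
            .size(VK10.VK_WHOLE_SIZE);

        VkDependencyInfoKHR dependencyInfo = VkDependencyInfoKHR.calloc(stack)
            .sType$Default()
            .pBufferMemoryBarriers(barrier);

        KHRSynchronization2.vkCmdPipelineBarrier2KHR(cmd, dependencyInfo);
    }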


If {@code buffer} was created with {@link VK10#VK_SHARING_MODE_EXCLUSIVE SHARING_MODE_EXCLUSIVE}, and {@code srcQueueFamilyIndex} is not equal to {@code dstQueueFamilyIndex}, this memory barrier defines a queue family transfer operation. When executed on a queue in the family identified by {@code srcQueueFamilyIndex}, this barrier defines a queue family release operation for the specified buffer range, and the second synchronization and access scopes do not synchronize operations on that queue. When executed on a queue in the family identified by {@code dstQueueFamilyIndex}, this barrier defines a queue family acquire operation for the specified buffer range, and the first synchronization and access scopes do not synchronize operations on that queue.


A queue family transfer operation is also defined if the values are not equal, and either is one of the special queue family values reserved for external memory ownership transfers, as described in Queue Family Ownership Transfer. A queue family release operation is defined when {@code dstQueueFamilyIndex} is one of those values, and a queue family acquire operation is defined when {@code srcQueueFamilyIndex} is one of those values.
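A hedged sketch of such an ownership transfer follows: the same barrier, with distinct {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex}, is recorded once on the releasing queue and once on the acquiring queue. The queue family indices, buffer handle, and command buffers below are assumptions from the application, and a semaphore must still order the two submissions.

    // Minimal sketch: `transferQueueFamily`, `graphicsQueueFamily`, `vertexBuffer`,
    // `transferCmd` and `graphicsCmd` are assumed to come from the application.
    try (MemoryStack stack = MemoryStack.stackPush()) {
        VkBufferMemoryBarrier2KHR.Buffer ownership = VkBufferMemoryBarrier2KHR.calloc(1, stack);
        ownership.get(0)
            .sType$Default()
            .srcStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_COPY_BIT_KHR)
            .srcAccessMask(KHRSynchronization2.VK_ACCESS_2_TRANSFER_WRITE_BIT_KHR)
            .dstStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_VERTEX_ATTRIBUTE_INPUT_BIT_KHR)
            .dstAccessMask(KHRSynchronization2.VK_ACCESS_2_VERTEX_ATTRIBUTE_READ_BIT_KHR)
            .srcQueueFamilyIndex(transferQueueFamily)   // releasing family
            .dstQueueFamilyIndex(graphicsQueueFamily)   // acquiring family
            .buffer(vertexBuffer)
            .offset(0)
            .size(VK10.VK_WHOLE_SIZE);

        VkDependencyInfoKHR dependencyInfo = VkDependencyInfoKHR.calloc(stack)
            .sType$Default()
            .pBufferMemoryBarriers(ownership);

        // The same barrier acts as the release when recorded on the source queue's
        // command buffer and as the acquire when recorded on the destination queue's;
        // a semaphore must still order the two queue submissions.
        KHRSynchronization2.vkCmdPipelineBarrier2KHR(transferCmd, dependencyInfo);
        KHRSynchronization2.vkCmdPipelineBarrier2KHR(graphicsCmd, dependencyInfo);
    }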

Valid Usage

  • If the geometry shaders feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_GEOMETRY_SHADER_BIT_KHR PIPELINE_STAGE_2_GEOMETRY_SHADER_BIT_KHR}
  • If the tessellation shaders feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TESSELLATION_CONTROL_SHADER_BIT_KHR PIPELINE_STAGE_2_TESSELLATION_CONTROL_SHADER_BIT_KHR} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TESSELLATION_EVALUATION_SHADER_BIT_KHR PIPELINE_STAGE_2_TESSELLATION_EVALUATION_SHADER_BIT_KHR}
  • If the conditional rendering feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT}
  • If the fragment density map feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT}
  • If the transform feedback feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}
  • If the mesh shaders feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_MESH_SHADER_BIT_NV PIPELINE_STAGE_2_MESH_SHADER_BIT_NV}
  • If the task shaders feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TASK_SHADER_BIT_NV PIPELINE_STAGE_2_TASK_SHADER_BIT_NV}
  • If the shading rate image feature is not enabled, {@code srcStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV}
  • If the subpass shading feature is not enabled, {@code srcStageMask} must not contain {@link HUAWEISubpassShading#VK_PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI}
  • If the invocation mask image feature is not enabled, {@code srcStageMask} must not contain {@link HUAWEIInvocationMask#VK_PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INDIRECT_COMMAND_READ_BIT_KHR ACCESS_2_INDIRECT_COMMAND_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INDEX_READ_BIT_KHR ACCESS_2_INDEX_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_INDEX_INPUT_BIT_KHR PIPELINE_STAGE_2_INDEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_VERTEX_ATTRIBUTE_READ_BIT_KHR ACCESS_2_VERTEX_ATTRIBUTE_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_ATTRIBUTE_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_ATTRIBUTE_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INPUT_ATTACHMENT_READ_BIT_KHR ACCESS_2_INPUT_ATTACHMENT_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_SHADER_BIT_KHR PIPELINE_STAGE_2_FRAGMENT_SHADER_BIT_KHR}, {@link HUAWEISubpassShading#VK_PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_UNIFORM_READ_BIT_KHR ACCESS_2_UNIFORM_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_SAMPLED_READ_BIT_KHR ACCESS_2_SHADER_SAMPLED_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_STORAGE_READ_BIT_KHR ACCESS_2_SHADER_STORAGE_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_STORAGE_WRITE_BIT_KHR ACCESS_2_SHADER_STORAGE_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_READ_BIT_KHR ACCESS_2_SHADER_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_WRITE_BIT_KHR ACCESS_2_SHADER_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_READ_BIT_KHR ACCESS_2_COLOR_ATTACHMENT_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_WRITE_BIT_KHR ACCESS_2_COLOR_ATTACHMENT_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_DEPTH_STENCIL_ATTACHMENT_READ_BIT_KHR ACCESS_2_DEPTH_STENCIL_ATTACHMENT_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_DEPTH_STENCIL_ATTACHMENT_WRITE_BIT_KHR ACCESS_2_DEPTH_STENCIL_ATTACHMENT_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFER_READ_BIT_KHR ACCESS_2_TRANSFER_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COPY_BIT_KHR PIPELINE_STAGE_2_COPY_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_BLIT_BIT_KHR PIPELINE_STAGE_2_BLIT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RESOLVE_BIT_KHR PIPELINE_STAGE_2_RESOLVE_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFER_WRITE_BIT_KHR ACCESS_2_TRANSFER_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COPY_BIT_KHR PIPELINE_STAGE_2_COPY_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_BLIT_BIT_KHR PIPELINE_STAGE_2_BLIT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RESOLVE_BIT_KHR PIPELINE_STAGE_2_RESOLVE_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CLEAR_BIT_KHR PIPELINE_STAGE_2_CLEAR_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_HOST_READ_BIT_KHR ACCESS_2_HOST_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_HOST_BIT_KHR PIPELINE_STAGE_2_HOST_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_HOST_WRITE_BIT_KHR ACCESS_2_HOST_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_HOST_BIT_KHR PIPELINE_STAGE_2_HOST_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_CONDITIONAL_RENDERING_READ_BIT_EXT ACCESS_2_CONDITIONAL_RENDERING_READ_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_FRAGMENT_DENSITY_MAP_READ_BIT_EXT ACCESS_2_FRAGMENT_DENSITY_MAP_READ_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_WRITE_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_WRITE_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_READ_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_READ_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_WRITE_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_WRITE_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADING_RATE_IMAGE_READ_BIT_NV ACCESS_2_SHADING_RATE_IMAGE_READ_BIT_NV}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link HUAWEIInvocationMask#VK_ACCESS_2_INVOCATION_MASK_READ_BIT_HUAWEI ACCESS_2_INVOCATION_MASK_READ_BIT_HUAWEI}, {@code srcStageMask} must include {@link HUAWEIInvocationMask#VK_PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COMMAND_PREPROCESS_READ_BIT_NV ACCESS_2_COMMAND_PREPROCESS_READ_BIT_NV}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COMMAND_PREPROCESS_WRITE_BIT_NV ACCESS_2_COMMAND_PREPROCESS_WRITE_BIT_NV}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_READ_NONCOHERENT_BIT_EXT ACCESS_2_COLOR_ATTACHMENT_READ_NONCOHERENT_BIT_EXT}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_WRITE_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code rayQuery} is not enabled and {@code srcAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR}, {@code srcStageMask} must not include any of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages except {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RAY_TRACING_SHADER_BIT_KHR PIPELINE_STAGE_2_RAY_TRACING_SHADER_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRVideoDecodeQueue#VK_ACCESS_2_VIDEO_DECODE_READ_BIT_KHR ACCESS_2_VIDEO_DECODE_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRVideoDecodeQueue#VK_PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRVideoDecodeQueue#VK_ACCESS_2_VIDEO_DECODE_WRITE_BIT_KHR ACCESS_2_VIDEO_DECODE_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRVideoDecodeQueue#VK_PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRVideoEncodeQueue#VK_ACCESS_2_VIDEO_ENCODE_READ_BIT_KHR ACCESS_2_VIDEO_ENCODE_READ_BIT_KHR}, {@code srcStageMask} must include {@link KHRVideoEncodeQueue#VK_PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR}
  • If {@code srcAccessMask} includes {@link KHRVideoEncodeQueue#VK_ACCESS_2_VIDEO_ENCODE_WRITE_BIT_KHR ACCESS_2_VIDEO_ENCODE_WRITE_BIT_KHR}, {@code srcStageMask} must include {@link KHRVideoEncodeQueue#VK_PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR}
  • If the geometry shaders feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_GEOMETRY_SHADER_BIT_KHR PIPELINE_STAGE_2_GEOMETRY_SHADER_BIT_KHR}
  • If the tessellation shaders feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TESSELLATION_CONTROL_SHADER_BIT_KHR PIPELINE_STAGE_2_TESSELLATION_CONTROL_SHADER_BIT_KHR} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TESSELLATION_EVALUATION_SHADER_BIT_KHR PIPELINE_STAGE_2_TESSELLATION_EVALUATION_SHADER_BIT_KHR}
  • If the conditional rendering feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT}
  • If the fragment density map feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT}
  • If the transform feedback feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}
  • If the mesh shaders feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_MESH_SHADER_BIT_NV PIPELINE_STAGE_2_MESH_SHADER_BIT_NV}
  • If the task shaders feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TASK_SHADER_BIT_NV PIPELINE_STAGE_2_TASK_SHADER_BIT_NV}
  • If the shading rate image feature is not enabled, {@code dstStageMask} must not contain {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV}
  • If the subpass shading feature is not enabled, {@code dstStageMask} must not contain {@link HUAWEISubpassShading#VK_PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI}
  • If the invocation mask image feature is not enabled, {@code dstStageMask} must not contain {@link HUAWEIInvocationMask#VK_PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INDIRECT_COMMAND_READ_BIT_KHR ACCESS_2_INDIRECT_COMMAND_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INDEX_READ_BIT_KHR ACCESS_2_INDEX_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_INDEX_INPUT_BIT_KHR PIPELINE_STAGE_2_INDEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_VERTEX_ATTRIBUTE_READ_BIT_KHR ACCESS_2_VERTEX_ATTRIBUTE_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_ATTRIBUTE_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_ATTRIBUTE_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR PIPELINE_STAGE_2_VERTEX_INPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_INPUT_ATTACHMENT_READ_BIT_KHR ACCESS_2_INPUT_ATTACHMENT_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_SHADER_BIT_KHR PIPELINE_STAGE_2_FRAGMENT_SHADER_BIT_KHR}, {@link HUAWEISubpassShading#VK_PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI PIPELINE_STAGE_2_SUBPASS_SHADING_BIT_HUAWEI}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_UNIFORM_READ_BIT_KHR ACCESS_2_UNIFORM_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_SAMPLED_READ_BIT_KHR ACCESS_2_SHADER_SAMPLED_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_STORAGE_READ_BIT_KHR ACCESS_2_SHADER_STORAGE_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_STORAGE_WRITE_BIT_KHR ACCESS_2_SHADER_STORAGE_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_READ_BIT_KHR ACCESS_2_SHADER_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADER_WRITE_BIT_KHR ACCESS_2_SHADER_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_READ_BIT_KHR ACCESS_2_COLOR_ATTACHMENT_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_WRITE_BIT_KHR ACCESS_2_COLOR_ATTACHMENT_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_DEPTH_STENCIL_ATTACHMENT_READ_BIT_KHR ACCESS_2_DEPTH_STENCIL_ATTACHMENT_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_DEPTH_STENCIL_ATTACHMENT_WRITE_BIT_KHR ACCESS_2_DEPTH_STENCIL_ATTACHMENT_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_EARLY_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR PIPELINE_STAGE_2_LATE_FRAGMENT_TESTS_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFER_READ_BIT_KHR ACCESS_2_TRANSFER_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COPY_BIT_KHR PIPELINE_STAGE_2_COPY_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_BLIT_BIT_KHR PIPELINE_STAGE_2_BLIT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RESOLVE_BIT_KHR PIPELINE_STAGE_2_RESOLVE_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFER_WRITE_BIT_KHR ACCESS_2_TRANSFER_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COPY_BIT_KHR PIPELINE_STAGE_2_COPY_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_BLIT_BIT_KHR PIPELINE_STAGE_2_BLIT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RESOLVE_BIT_KHR PIPELINE_STAGE_2_RESOLVE_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CLEAR_BIT_KHR PIPELINE_STAGE_2_CLEAR_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR PIPELINE_STAGE_2_ALL_TRANSFER_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_HOST_READ_BIT_KHR ACCESS_2_HOST_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_HOST_BIT_KHR PIPELINE_STAGE_2_HOST_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_HOST_WRITE_BIT_KHR ACCESS_2_HOST_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_HOST_BIT_KHR PIPELINE_STAGE_2_HOST_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_CONDITIONAL_RENDERING_READ_BIT_EXT ACCESS_2_CONDITIONAL_RENDERING_READ_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT PIPELINE_STAGE_2_CONDITIONAL_RENDERING_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_FRAGMENT_DENSITY_MAP_READ_BIT_EXT ACCESS_2_FRAGMENT_DENSITY_MAP_READ_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT PIPELINE_STAGE_2_FRAGMENT_DENSITY_PROCESS_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_WRITE_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_WRITE_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_READ_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_READ_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR PIPELINE_STAGE_2_DRAW_INDIRECT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_WRITE_BIT_EXT ACCESS_2_TRANSFORM_FEEDBACK_COUNTER_WRITE_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT PIPELINE_STAGE_2_TRANSFORM_FEEDBACK_BIT_EXT}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_SHADING_RATE_IMAGE_READ_BIT_NV ACCESS_2_SHADING_RATE_IMAGE_READ_BIT_NV}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV PIPELINE_STAGE_2_SHADING_RATE_IMAGE_BIT_NV}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link HUAWEIInvocationMask#VK_ACCESS_2_INVOCATION_MASK_READ_BIT_HUAWEI ACCESS_2_INVOCATION_MASK_READ_BIT_HUAWEI}, {@code dstStageMask} must include {@link HUAWEIInvocationMask#VK_PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI PIPELINE_STAGE_2_INVOCATION_MASK_BIT_HUAWEI}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COMMAND_PREPROCESS_READ_BIT_NV ACCESS_2_COMMAND_PREPROCESS_READ_BIT_NV}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COMMAND_PREPROCESS_WRITE_BIT_NV ACCESS_2_COMMAND_PREPROCESS_WRITE_BIT_NV}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV PIPELINE_STAGE_2_COMMAND_PREPROCESS_BIT_NV} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_COLOR_ATTACHMENT_READ_NONCOHERENT_BIT_EXT ACCESS_2_COLOR_ATTACHMENT_READ_NONCOHERENT_BIT_EXT}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR PIPELINE_STAGE_2_COLOR_ATTACHMENT_OUTPUT_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR PIPELINE_STAGE_2_ALL_GRAPHICS_BIT_KHR}, or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR}, {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}, or one of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages
  • If {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_WRITE_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR PIPELINE_STAGE_2_ACCELERATION_STRUCTURE_BUILD_BIT_KHR} or {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR PIPELINE_STAGE_2_ALL_COMMANDS_BIT_KHR}
  • If {@code rayQuery} is not enabled and {@code dstAccessMask} includes {@link KHRSynchronization2#VK_ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR ACCESS_2_ACCELERATION_STRUCTURE_READ_BIT_KHR}, {@code dstStageMask} must not include any of the {@code VK_PIPELINE_STAGE_*_SHADER_BIT} stages except {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_RAY_TRACING_SHADER_BIT_KHR PIPELINE_STAGE_2_RAY_TRACING_SHADER_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRVideoDecodeQueue#VK_ACCESS_2_VIDEO_DECODE_READ_BIT_KHR ACCESS_2_VIDEO_DECODE_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRVideoDecodeQueue#VK_PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRVideoDecodeQueue#VK_ACCESS_2_VIDEO_DECODE_WRITE_BIT_KHR ACCESS_2_VIDEO_DECODE_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRVideoDecodeQueue#VK_PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_DECODE_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRVideoEncodeQueue#VK_ACCESS_2_VIDEO_ENCODE_READ_BIT_KHR ACCESS_2_VIDEO_ENCODE_READ_BIT_KHR}, {@code dstStageMask} must include {@link KHRVideoEncodeQueue#VK_PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR}
  • If {@code dstAccessMask} includes {@link KHRVideoEncodeQueue#VK_ACCESS_2_VIDEO_ENCODE_WRITE_BIT_KHR ACCESS_2_VIDEO_ENCODE_WRITE_BIT_KHR}, {@code dstStageMask} must include {@link KHRVideoEncodeQueue#VK_PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR PIPELINE_STAGE_2_VIDEO_ENCODE_BIT_KHR}
  • {@code offset} must be less than the size of {@code buffer}
  • If {@code size} is not equal to {@link VK10#VK_WHOLE_SIZE WHOLE_SIZE}, {@code size} must be greater than 0
  • If {@code size} is not equal to {@link VK10#VK_WHOLE_SIZE WHOLE_SIZE}, {@code size} must be less than or equal to the size of {@code buffer} minus {@code offset}
  • If {@code buffer} is non-sparse then it must be bound completely and contiguously to a single {@code VkDeviceMemory} object
  • If {@code srcQueueFamilyIndex} is not equal to {@code dstQueueFamilyIndex}, at least one must not be a special queue family reserved for external memory ownership transfers, as described in Queue Family Ownership Transfer
  • If {@code buffer} was created with a sharing mode of {@link VK10#VK_SHARING_MODE_CONCURRENT SHARING_MODE_CONCURRENT}, {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex} are not equal, and one of {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex} is one of the special queue family values reserved for external memory transfers, the other must be {@link VK10#VK_QUEUE_FAMILY_IGNORED QUEUE_FAMILY_IGNORED}
  • If {@code buffer} was created with a sharing mode of {@link VK10#VK_SHARING_MODE_EXCLUSIVE SHARING_MODE_EXCLUSIVE}, and {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex} are not equal, {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex} must both be valid queue families, or one of the special queue family values reserved for external memory transfers, as described in Queue Family Ownership Transfer
  • If either {@code srcStageMask} or {@code dstStageMask} includes {@link KHRSynchronization2#VK_PIPELINE_STAGE_2_HOST_BIT_KHR PIPELINE_STAGE_2_HOST_BIT_KHR}, {@code srcQueueFamilyIndex} and {@code dstQueueFamilyIndex} must be equal

Valid Usage (Implicit)

  • {@code sType} must be {@link KHRSynchronization2#VK_STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR}
  • {@code pNext} must be {@code NULL}
  • {@code srcStageMask} must be a valid combination of {@code VkPipelineStageFlagBits2KHR} values
  • {@code srcAccessMask} must be a valid combination of {@code VkAccessFlagBits2KHR} values
  • {@code dstStageMask} must be a valid combination of {@code VkPipelineStageFlagBits2KHR} values
  • {@code dstAccessMask} must be a valid combination of {@code VkAccessFlagBits2KHR} values
  • {@code buffer} must be a valid {@code VkBuffer} handle

See Also

{@link VkDependencyInfoKHR}

Layout

 * struct VkBufferMemoryBarrier2KHR {
 *     VkStructureType {@link #sType};
 *     void const * {@link #pNext};
 *     VkPipelineStageFlags2KHR {@link #srcStageMask};
 *     VkAccessFlags2KHR {@link #srcAccessMask};
 *     VkPipelineStageFlags2KHR {@link #dstStageMask};
 *     VkAccessFlags2KHR {@link #dstAccessMask};
 *     uint32_t {@link #srcQueueFamilyIndex};
 *     uint32_t {@link #dstQueueFamilyIndex};
 *     VkBuffer {@link #buffer};
 *     VkDeviceSize {@link #offset};
 *     VkDeviceSize {@link #size};
 * }
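
Since this class implements {@code NativeResource}, a short sketch of the two allocation styles it supports: heap instances obtained from {@code malloc()}/{@code calloc()} must be freed explicitly, while stack instances are valid only for the enclosing {@link MemoryStack} frame.

    // Heap allocation: the caller owns the instance and must free it explicitly.
    VkBufferMemoryBarrier2KHR heapBarrier = VkBufferMemoryBarrier2KHR.calloc();
    try {
        heapBarrier.sType$Default()
                   .srcQueueFamilyIndex(VK10.VK_QUEUE_FAMILY_IGNORED)
                   .dstQueueFamilyIndex(VK10.VK_QUEUE_FAMILY_IGNORED);
        // ... fill in and use the struct ...
    } finally {
        heapBarrier.free(); // NativeResource: required for malloc()/calloc() instances
    }

    // Stack allocation: valid only until the enclosing stack frame is popped.
    try (MemoryStack stack = MemoryStack.stackPush()) {
        VkBufferMemoryBarrier2KHR stackBarrier = VkBufferMemoryBarrier2KHR.calloc(stack)
            .sType$Default();
        // ... record the barrier before the frame ends ...
    }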
 */
public class VkBufferMemoryBarrier2KHR extends Struct implements NativeResource {

    /** The struct size in bytes. */
    public static final int SIZEOF;

    /** The struct alignment in bytes. */
    public static final int ALIGNOF;

    /** The struct member offsets. */
    public static final int
        STYPE,
        PNEXT,
        SRCSTAGEMASK,
        SRCACCESSMASK,
        DSTSTAGEMASK,
        DSTACCESSMASK,
        SRCQUEUEFAMILYINDEX,
        DSTQUEUEFAMILYINDEX,
        BUFFER,
        OFFSET,
        SIZE;

    static {
        Layout layout = __struct(
            __member(4),
            __member(POINTER_SIZE),
            __member(8),
            __member(8),
            __member(8),
            __member(8),
            __member(4),
            __member(4),
            __member(8),
            __member(8),
            __member(8)
        );

        SIZEOF = layout.getSize();
        ALIGNOF = layout.getAlignment();

        STYPE = layout.offsetof(0);
        PNEXT = layout.offsetof(1);
        SRCSTAGEMASK = layout.offsetof(2);
        SRCACCESSMASK = layout.offsetof(3);
        DSTSTAGEMASK = layout.offsetof(4);
        DSTACCESSMASK = layout.offsetof(5);
        SRCQUEUEFAMILYINDEX = layout.offsetof(6);
        DSTQUEUEFAMILYINDEX = layout.offsetof(7);
        BUFFER = layout.offsetof(8);
        OFFSET = layout.offsetof(9);
        SIZE = layout.offsetof(10);
    }

    /**
     * Creates a {@code VkBufferMemoryBarrier2KHR} instance at the current position of the specified {@link ByteBuffer} container.
     * Changes to the buffer's content will be visible to the struct instance and vice versa.
     *
     * The created instance holds a strong reference to the container object.
*/ public VkBufferMemoryBarrier2KHR(ByteBuffer container) { super(memAddress(container), __checkContainer(container, SIZEOF)); } @Override public int sizeof() { return SIZEOF; } /** the type of this structure. */ @NativeType("VkStructureType") public int sType() { return nsType(address()); } /** {@code NULL} or a pointer to a structure extending this structure. */ @NativeType("void const *") public long pNext() { return npNext(address()); } /** a {@code VkPipelineStageFlags2KHR} mask of pipeline stages to be included in the first synchronization scope. */ @NativeType("VkPipelineStageFlags2KHR") public long srcStageMask() { return nsrcStageMask(address()); } /** a {@code VkAccessFlags2KHR} mask of access flags to be included in the first access scope. */ @NativeType("VkAccessFlags2KHR") public long srcAccessMask() { return nsrcAccessMask(address()); } /** a {@code VkPipelineStageFlags2KHR} mask of pipeline stages to be included in the second synchronization scope. */ @NativeType("VkPipelineStageFlags2KHR") public long dstStageMask() { return ndstStageMask(address()); } /** a {@code VkAccessFlags2KHR} mask of access flags to be included in the second access scope. */ @NativeType("VkAccessFlags2KHR") public long dstAccessMask() { return ndstAccessMask(address()); } /** the source queue family for a queue family ownership transfer. */ @NativeType("uint32_t") public int srcQueueFamilyIndex() { return nsrcQueueFamilyIndex(address()); } /** the destination queue family for a queue family ownership transfer. */ @NativeType("uint32_t") public int dstQueueFamilyIndex() { return ndstQueueFamilyIndex(address()); } /** a handle to the buffer whose backing memory is affected by the barrier. */ @NativeType("VkBuffer") public long buffer() { return nbuffer(address()); } /** an offset in bytes into the backing memory for {@code buffer}; this is relative to the base offset as bound to the buffer (see {@link VK10#vkBindBufferMemory BindBufferMemory}). */ @NativeType("VkDeviceSize") public long offset() { return noffset(address()); } /** a size in bytes of the affected area of backing memory for {@code buffer}, or {@link VK10#VK_WHOLE_SIZE WHOLE_SIZE} to use the range from {@code offset} to the end of the buffer. */ @NativeType("VkDeviceSize") public long size() { return nsize(address()); } /** Sets the specified value to the {@link #sType} field. */ public VkBufferMemoryBarrier2KHR sType(@NativeType("VkStructureType") int value) { nsType(address(), value); return this; } /** Sets the {@link KHRSynchronization2#VK_STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR} value to the {@link #sType} field. */ public VkBufferMemoryBarrier2KHR sType$Default() { return sType(KHRSynchronization2.VK_STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR); } /** Sets the specified value to the {@link #pNext} field. */ public VkBufferMemoryBarrier2KHR pNext(@NativeType("void const *") long value) { npNext(address(), value); return this; } /** Sets the specified value to the {@link #srcStageMask} field. */ public VkBufferMemoryBarrier2KHR srcStageMask(@NativeType("VkPipelineStageFlags2KHR") long value) { nsrcStageMask(address(), value); return this; } /** Sets the specified value to the {@link #srcAccessMask} field. */ public VkBufferMemoryBarrier2KHR srcAccessMask(@NativeType("VkAccessFlags2KHR") long value) { nsrcAccessMask(address(), value); return this; } /** Sets the specified value to the {@link #dstStageMask} field. 
*/ public VkBufferMemoryBarrier2KHR dstStageMask(@NativeType("VkPipelineStageFlags2KHR") long value) { ndstStageMask(address(), value); return this; } /** Sets the specified value to the {@link #dstAccessMask} field. */ public VkBufferMemoryBarrier2KHR dstAccessMask(@NativeType("VkAccessFlags2KHR") long value) { ndstAccessMask(address(), value); return this; } /** Sets the specified value to the {@link #srcQueueFamilyIndex} field. */ public VkBufferMemoryBarrier2KHR srcQueueFamilyIndex(@NativeType("uint32_t") int value) { nsrcQueueFamilyIndex(address(), value); return this; } /** Sets the specified value to the {@link #dstQueueFamilyIndex} field. */ public VkBufferMemoryBarrier2KHR dstQueueFamilyIndex(@NativeType("uint32_t") int value) { ndstQueueFamilyIndex(address(), value); return this; } /** Sets the specified value to the {@link #buffer} field. */ public VkBufferMemoryBarrier2KHR buffer(@NativeType("VkBuffer") long value) { nbuffer(address(), value); return this; } /** Sets the specified value to the {@link #offset} field. */ public VkBufferMemoryBarrier2KHR offset(@NativeType("VkDeviceSize") long value) { noffset(address(), value); return this; } /** Sets the specified value to the {@link #size} field. */ public VkBufferMemoryBarrier2KHR size(@NativeType("VkDeviceSize") long value) { nsize(address(), value); return this; } /** Initializes this struct with the specified values. */ public VkBufferMemoryBarrier2KHR set( int sType, long pNext, long srcStageMask, long srcAccessMask, long dstStageMask, long dstAccessMask, int srcQueueFamilyIndex, int dstQueueFamilyIndex, long buffer, long offset, long size ) { sType(sType); pNext(pNext); srcStageMask(srcStageMask); srcAccessMask(srcAccessMask); dstStageMask(dstStageMask); dstAccessMask(dstAccessMask); srcQueueFamilyIndex(srcQueueFamilyIndex); dstQueueFamilyIndex(dstQueueFamilyIndex); buffer(buffer); offset(offset); size(size); return this; } /** * Copies the specified struct data to this struct. * * @param src the source struct * * @return this struct */ public VkBufferMemoryBarrier2KHR set(VkBufferMemoryBarrier2KHR src) { memCopy(src.address(), address(), SIZEOF); return this; } // ----------------------------------- /** Returns a new {@code VkBufferMemoryBarrier2KHR} instance allocated with {@link MemoryUtil#memAlloc memAlloc}. The instance must be explicitly freed. */ public static VkBufferMemoryBarrier2KHR malloc() { return wrap(VkBufferMemoryBarrier2KHR.class, nmemAllocChecked(SIZEOF)); } /** Returns a new {@code VkBufferMemoryBarrier2KHR} instance allocated with {@link MemoryUtil#memCalloc memCalloc}. The instance must be explicitly freed. */ public static VkBufferMemoryBarrier2KHR calloc() { return wrap(VkBufferMemoryBarrier2KHR.class, nmemCallocChecked(1, SIZEOF)); } /** Returns a new {@code VkBufferMemoryBarrier2KHR} instance allocated with {@link BufferUtils}. */ public static VkBufferMemoryBarrier2KHR create() { ByteBuffer container = BufferUtils.createByteBuffer(SIZEOF); return wrap(VkBufferMemoryBarrier2KHR.class, memAddress(container), container); } /** Returns a new {@code VkBufferMemoryBarrier2KHR} instance for the specified memory address. */ public static VkBufferMemoryBarrier2KHR create(long address) { return wrap(VkBufferMemoryBarrier2KHR.class, address); } /** Like {@link #create(long) create}, but returns {@code null} if {@code address} is {@code NULL}. */ @Nullable public static VkBufferMemoryBarrier2KHR createSafe(long address) { return address == NULL ? 
null : wrap(VkBufferMemoryBarrier2KHR.class, address); } /** * Returns a new {@link VkBufferMemoryBarrier2KHR.Buffer} instance allocated with {@link MemoryUtil#memAlloc memAlloc}. The instance must be explicitly freed. * * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer malloc(int capacity) { return wrap(Buffer.class, nmemAllocChecked(__checkMalloc(capacity, SIZEOF)), capacity); } /** * Returns a new {@link VkBufferMemoryBarrier2KHR.Buffer} instance allocated with {@link MemoryUtil#memCalloc memCalloc}. The instance must be explicitly freed. * * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer calloc(int capacity) { return wrap(Buffer.class, nmemCallocChecked(capacity, SIZEOF), capacity); } /** * Returns a new {@link VkBufferMemoryBarrier2KHR.Buffer} instance allocated with {@link BufferUtils}. * * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer create(int capacity) { ByteBuffer container = __create(capacity, SIZEOF); return wrap(Buffer.class, memAddress(container), capacity, container); } /** * Create a {@link VkBufferMemoryBarrier2KHR.Buffer} instance at the specified memory. * * @param address the memory address * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer create(long address, int capacity) { return wrap(Buffer.class, address, capacity); } /** Like {@link #create(long, int) create}, but returns {@code null} if {@code address} is {@code NULL}. */ @Nullable public static VkBufferMemoryBarrier2KHR.Buffer createSafe(long address, int capacity) { return address == NULL ? null : wrap(Buffer.class, address, capacity); } /** * Returns a new {@code VkBufferMemoryBarrier2KHR} instance allocated on the specified {@link MemoryStack}. * * @param stack the stack from which to allocate */ public static VkBufferMemoryBarrier2KHR malloc(MemoryStack stack) { return wrap(VkBufferMemoryBarrier2KHR.class, stack.nmalloc(ALIGNOF, SIZEOF)); } /** * Returns a new {@code VkBufferMemoryBarrier2KHR} instance allocated on the specified {@link MemoryStack} and initializes all its bits to zero. * * @param stack the stack from which to allocate */ public static VkBufferMemoryBarrier2KHR calloc(MemoryStack stack) { return wrap(VkBufferMemoryBarrier2KHR.class, stack.ncalloc(ALIGNOF, 1, SIZEOF)); } /** * Returns a new {@link VkBufferMemoryBarrier2KHR.Buffer} instance allocated on the specified {@link MemoryStack}. * * @param stack the stack from which to allocate * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer malloc(int capacity, MemoryStack stack) { return wrap(Buffer.class, stack.nmalloc(ALIGNOF, capacity * SIZEOF), capacity); } /** * Returns a new {@link VkBufferMemoryBarrier2KHR.Buffer} instance allocated on the specified {@link MemoryStack} and initializes all its bits to zero. * * @param stack the stack from which to allocate * @param capacity the buffer capacity */ public static VkBufferMemoryBarrier2KHR.Buffer calloc(int capacity, MemoryStack stack) { return wrap(Buffer.class, stack.ncalloc(ALIGNOF, capacity, SIZEOF), capacity); } // ----------------------------------- /** Unsafe version of {@link #sType}. */ public static int nsType(long struct) { return UNSAFE.getInt(null, struct + VkBufferMemoryBarrier2KHR.STYPE); } /** Unsafe version of {@link #pNext}. */ public static long npNext(long struct) { return memGetAddress(struct + VkBufferMemoryBarrier2KHR.PNEXT); } /** Unsafe version of {@link #srcStageMask}. 
    // -----------------------------------

    /** Unsafe version of {@link #sType}. */
    public static int nsType(long struct) { return UNSAFE.getInt(null, struct + VkBufferMemoryBarrier2KHR.STYPE); }
    /** Unsafe version of {@link #pNext}. */
    public static long npNext(long struct) { return memGetAddress(struct + VkBufferMemoryBarrier2KHR.PNEXT); }
    /** Unsafe version of {@link #srcStageMask}. */
    public static long nsrcStageMask(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.SRCSTAGEMASK); }
    /** Unsafe version of {@link #srcAccessMask}. */
    public static long nsrcAccessMask(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.SRCACCESSMASK); }
    /** Unsafe version of {@link #dstStageMask}. */
    public static long ndstStageMask(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.DSTSTAGEMASK); }
    /** Unsafe version of {@link #dstAccessMask}. */
    public static long ndstAccessMask(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.DSTACCESSMASK); }
    /** Unsafe version of {@link #srcQueueFamilyIndex}. */
    public static int nsrcQueueFamilyIndex(long struct) { return UNSAFE.getInt(null, struct + VkBufferMemoryBarrier2KHR.SRCQUEUEFAMILYINDEX); }
    /** Unsafe version of {@link #dstQueueFamilyIndex}. */
    public static int ndstQueueFamilyIndex(long struct) { return UNSAFE.getInt(null, struct + VkBufferMemoryBarrier2KHR.DSTQUEUEFAMILYINDEX); }
    /** Unsafe version of {@link #buffer}. */
    public static long nbuffer(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.BUFFER); }
    /** Unsafe version of {@link #offset}. */
    public static long noffset(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.OFFSET); }
    /** Unsafe version of {@link #size}. */
    public static long nsize(long struct) { return UNSAFE.getLong(null, struct + VkBufferMemoryBarrier2KHR.SIZE); }

    /** Unsafe version of {@link #sType(int) sType}. */
    public static void nsType(long struct, int value) { UNSAFE.putInt(null, struct + VkBufferMemoryBarrier2KHR.STYPE, value); }
    /** Unsafe version of {@link #pNext(long) pNext}. */
    public static void npNext(long struct, long value) { memPutAddress(struct + VkBufferMemoryBarrier2KHR.PNEXT, value); }
    /** Unsafe version of {@link #srcStageMask(long) srcStageMask}. */
    public static void nsrcStageMask(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.SRCSTAGEMASK, value); }
    /** Unsafe version of {@link #srcAccessMask(long) srcAccessMask}. */
    public static void nsrcAccessMask(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.SRCACCESSMASK, value); }
    /** Unsafe version of {@link #dstStageMask(long) dstStageMask}. */
    public static void ndstStageMask(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.DSTSTAGEMASK, value); }
    /** Unsafe version of {@link #dstAccessMask(long) dstAccessMask}. */
    public static void ndstAccessMask(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.DSTACCESSMASK, value); }
    /** Unsafe version of {@link #srcQueueFamilyIndex(int) srcQueueFamilyIndex}. */
    public static void nsrcQueueFamilyIndex(long struct, int value) { UNSAFE.putInt(null, struct + VkBufferMemoryBarrier2KHR.SRCQUEUEFAMILYINDEX, value); }
    /** Unsafe version of {@link #dstQueueFamilyIndex(int) dstQueueFamilyIndex}. */
    public static void ndstQueueFamilyIndex(long struct, int value) { UNSAFE.putInt(null, struct + VkBufferMemoryBarrier2KHR.DSTQUEUEFAMILYINDEX, value); }
    /** Unsafe version of {@link #buffer(long) buffer}. */
    public static void nbuffer(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.BUFFER, value); }
    /** Unsafe version of {@link #offset(long) offset}. */
    public static void noffset(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.OFFSET, value); }
    /** Unsafe version of {@link #size(long) size}. */
    public static void nsize(long struct, long value) { UNSAFE.putLong(null, struct + VkBufferMemoryBarrier2KHR.SIZE, value); }

    // -----------------------------------

    /** An array of {@link VkBufferMemoryBarrier2KHR} structs. */
    public static class Buffer extends StructBuffer<VkBufferMemoryBarrier2KHR, Buffer> implements NativeResource {

        private static final VkBufferMemoryBarrier2KHR ELEMENT_FACTORY = VkBufferMemoryBarrier2KHR.create(-1L);

        /**
         * Creates a new {@code VkBufferMemoryBarrier2KHR.Buffer} instance backed by the specified container.
         *
         * <p>Changes to the container's content will be visible to the struct buffer instance and vice versa. The two buffers' position, limit, and mark values
         * will be independent. The new buffer's position will be zero, its capacity and its limit will be the number of bytes remaining in this buffer divided
         * by {@link VkBufferMemoryBarrier2KHR#SIZEOF}, and its mark will be undefined.</p>
         *
         * <p>The created buffer instance holds a strong reference to the container object.</p>
         */
        public Buffer(ByteBuffer container) {
            super(container, container.remaining() / SIZEOF);
        }

        public Buffer(long address, int cap) {
            super(address, null, -1, 0, cap, cap);
        }

        Buffer(long address, @Nullable ByteBuffer container, int mark, int pos, int lim, int cap) {
            super(address, container, mark, pos, lim, cap);
        }

        @Override
        protected Buffer self() {
            return this;
        }

        @Override
        protected VkBufferMemoryBarrier2KHR getElementFactory() {
            return ELEMENT_FACTORY;
        }

        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#sType} field. */
        @NativeType("VkStructureType")
        public int sType() { return VkBufferMemoryBarrier2KHR.nsType(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#pNext} field. */
        @NativeType("void const *")
        public long pNext() { return VkBufferMemoryBarrier2KHR.npNext(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#srcStageMask} field. */
        @NativeType("VkPipelineStageFlags2KHR")
        public long srcStageMask() { return VkBufferMemoryBarrier2KHR.nsrcStageMask(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#srcAccessMask} field. */
        @NativeType("VkAccessFlags2KHR")
        public long srcAccessMask() { return VkBufferMemoryBarrier2KHR.nsrcAccessMask(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#dstStageMask} field. */
        @NativeType("VkPipelineStageFlags2KHR")
        public long dstStageMask() { return VkBufferMemoryBarrier2KHR.ndstStageMask(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#dstAccessMask} field. */
        @NativeType("VkAccessFlags2KHR")
        public long dstAccessMask() { return VkBufferMemoryBarrier2KHR.ndstAccessMask(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#srcQueueFamilyIndex} field. */
        @NativeType("uint32_t")
        public int srcQueueFamilyIndex() { return VkBufferMemoryBarrier2KHR.nsrcQueueFamilyIndex(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#dstQueueFamilyIndex} field. */
        @NativeType("uint32_t")
        public int dstQueueFamilyIndex() { return VkBufferMemoryBarrier2KHR.ndstQueueFamilyIndex(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#buffer} field. */
        @NativeType("VkBuffer")
        public long buffer() { return VkBufferMemoryBarrier2KHR.nbuffer(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#offset} field. */
        @NativeType("VkDeviceSize")
        public long offset() { return VkBufferMemoryBarrier2KHR.noffset(address()); }
        /** @return the value of the {@link VkBufferMemoryBarrier2KHR#size} field. */
        @NativeType("VkDeviceSize")
        public long size() { return VkBufferMemoryBarrier2KHR.nsize(address()); }

        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#sType} field. */
        public VkBufferMemoryBarrier2KHR.Buffer sType(@NativeType("VkStructureType") int value) { VkBufferMemoryBarrier2KHR.nsType(address(), value); return this; }
        /** Sets the {@link KHRSynchronization2#VK_STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR} value to the {@link VkBufferMemoryBarrier2KHR#sType} field. */
        public VkBufferMemoryBarrier2KHR.Buffer sType$Default() { return sType(KHRSynchronization2.VK_STRUCTURE_TYPE_BUFFER_MEMORY_BARRIER_2_KHR); }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#pNext} field. */
        public VkBufferMemoryBarrier2KHR.Buffer pNext(@NativeType("void const *") long value) { VkBufferMemoryBarrier2KHR.npNext(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#srcStageMask} field. */
        public VkBufferMemoryBarrier2KHR.Buffer srcStageMask(@NativeType("VkPipelineStageFlags2KHR") long value) { VkBufferMemoryBarrier2KHR.nsrcStageMask(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#srcAccessMask} field. */
        public VkBufferMemoryBarrier2KHR.Buffer srcAccessMask(@NativeType("VkAccessFlags2KHR") long value) { VkBufferMemoryBarrier2KHR.nsrcAccessMask(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#dstStageMask} field. */
        public VkBufferMemoryBarrier2KHR.Buffer dstStageMask(@NativeType("VkPipelineStageFlags2KHR") long value) { VkBufferMemoryBarrier2KHR.ndstStageMask(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#dstAccessMask} field. */
        public VkBufferMemoryBarrier2KHR.Buffer dstAccessMask(@NativeType("VkAccessFlags2KHR") long value) { VkBufferMemoryBarrier2KHR.ndstAccessMask(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#srcQueueFamilyIndex} field. */
        public VkBufferMemoryBarrier2KHR.Buffer srcQueueFamilyIndex(@NativeType("uint32_t") int value) { VkBufferMemoryBarrier2KHR.nsrcQueueFamilyIndex(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#dstQueueFamilyIndex} field. */
        public VkBufferMemoryBarrier2KHR.Buffer dstQueueFamilyIndex(@NativeType("uint32_t") int value) { VkBufferMemoryBarrier2KHR.ndstQueueFamilyIndex(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#buffer} field. */
        public VkBufferMemoryBarrier2KHR.Buffer buffer(@NativeType("VkBuffer") long value) { VkBufferMemoryBarrier2KHR.nbuffer(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#offset} field. */
        public VkBufferMemoryBarrier2KHR.Buffer offset(@NativeType("VkDeviceSize") long value) { VkBufferMemoryBarrier2KHR.noffset(address(), value); return this; }
        /** Sets the specified value to the {@link VkBufferMemoryBarrier2KHR#size} field. */
        public VkBufferMemoryBarrier2KHR.Buffer size(@NativeType("VkDeviceSize") long value) { VkBufferMemoryBarrier2KHR.nsize(address(), value); return this; }

    }

}
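Below is a hedged, self-contained sketch of how this struct is commonly used with the rest of the VK_KHR_synchronization2 binding. The helper class, the method name, and the handles passed in are illustrative only; the VkDependencyInfoKHR/vkCmdPipelineBarrier2KHR calls and the VK_PIPELINE_STAGE_2_*/VK_ACCESS_2_* constants reflect the extension's usual LWJGL surface but are not defined in the file above, so treat them as assumptions rather than as part of this class.

import org.lwjgl.system.MemoryStack;
import org.lwjgl.vulkan.KHRSynchronization2;
import org.lwjgl.vulkan.VkBufferMemoryBarrier2KHR;
import org.lwjgl.vulkan.VkCommandBuffer;
import org.lwjgl.vulkan.VkDependencyInfoKHR;

import static org.lwjgl.vulkan.VK10.*;

/** Illustrative helper; class and method names are hypothetical. */
public final class BufferBarrier2Sketch {

    /**
     * Records a transfer-write to vertex-shader-read barrier over the whole of {@code buffer}.
     * {@code cmd} and {@code buffer} are caller-supplied handles.
     */
    public static void recordBufferBarrier(VkCommandBuffer cmd, long buffer) {
        try (MemoryStack stack = MemoryStack.stackPush()) {
            // One-element struct buffer, zero-initialized on the current stack frame and
            // filled through the fluent setters generated in VkBufferMemoryBarrier2KHR.Buffer.
            VkBufferMemoryBarrier2KHR.Buffer barriers = VkBufferMemoryBarrier2KHR.calloc(1, stack)
                .sType$Default()
                .srcStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_COPY_BIT_KHR)
                .srcAccessMask(KHRSynchronization2.VK_ACCESS_2_TRANSFER_WRITE_BIT_KHR)
                .dstStageMask(KHRSynchronization2.VK_PIPELINE_STAGE_2_VERTEX_SHADER_BIT_KHR)
                .dstAccessMask(KHRSynchronization2.VK_ACCESS_2_SHADER_READ_BIT_KHR)
                .srcQueueFamilyIndex(VK_QUEUE_FAMILY_IGNORED) // no queue family ownership transfer
                .dstQueueFamilyIndex(VK_QUEUE_FAMILY_IGNORED)
                .buffer(buffer)
                .offset(0)
                .size(VK_WHOLE_SIZE);

            // Wrap the barrier array in a dependency info and record it; these two calls are
            // the usual synchronization2 entry points and are assumed, not taken from this file.
            VkDependencyInfoKHR dependencyInfo = VkDependencyInfoKHR.calloc(stack)
                .sType$Default()
                .pBufferMemoryBarriers(barriers);

            KHRSynchronization2.vkCmdPipelineBarrier2KHR(cmd, dependencyInfo);
        }
    }

    private BufferBarrier2Sketch() {
    }
}

Setting both queue family indices to VK_QUEUE_FAMILY_IGNORED keeps this a plain memory barrier; supplying two different family indices instead produces the release/acquire pair described in the class documentation.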



