
Is Mantle even a good idea?

Underseer
http://en.wikipedia.org/wiki/Mantle_(API)

The thing that worries me about Mantle is that it shifts responsibility for optimization away from the drivers and towards the game developers. While I can understand why developers want this level of control, what happens to Mantle-based games after the developers stop producing patches? At that point, nobody is dealing with the pain in the ass of doing separate optimizations for each hardware configuration.

Doesn't that mean that as time goes on, old Mantle games could become poorly optimized or even break on whatever hardware future computers will be using?
 
Direct3D is going the same route: lower-level API so that PC games can be optimised as much as console games.

Any loss of optimisation on future hardware, due to changes in architecture or whatever, may well be offset by gains in brute processing power.

I've also noted that there may be an OpenGL implementation built on top of Mantle, unless AMD releases Mantle for Linux. (Not holding my breath.)
 
Apple has something similar for iOS: Metal (Apple's developer docs). I've also found a slideshow for Mantle itself: Mantle - Introducing a new API for Graphics - AMD at GDC14 -- it covers some of the setup code that you'd write for Mantle.

OpenGL 4.5 released, next-gen OpenGL unveiled: Cross-platform Mantle killer, DX12 competitor | ExtremeTech
OpenGL in 2014 — Tom Dalling

The next generation of OpenGL, codenamed glNext, seems like it will be much like Mantle, Metal, and DX12. Here is a presentation on it. On PDF page 74 is a list of participants:

Pixar, Oculus VR, Sony, Epic Games, Blizzard, Apple, Unity, Valve, Electronic Arts, Qualcomm, AMD, Hi Corp, RTT, MediaTek, Samsung, Intel, Transgaming, Nvidia, ARM, Broadcom, Mobica, Imagination, Vivante

No Microsoft. But it's interesting that Apple and AMD are on board, alongside Android developers, video-card developers, and some major 3D game and graphics companies.
 
Carmack on OpenGL is his legendary 1996 review of Direct3D:
The overriding reason why GL is so much better than D3D has to do with ease of use. GL is easy to use and fun to experiment with. D3D is not (ahem). You can make sample GL programs with a single page of code. I think D3D has managed to make the worst possible interface choice at every opportunity. COM. Expandable structs passed to functions. Execute buffers. Some of these choices were made so that the API would be able to gracefully expand in the future, but who cares about having an API that can grow if you have forced it to be painful to use now and forever after? Many things that are a single line of GL code require half a page of D3D code to allocate a structure, set a size, fill something in, call a COM routine, then extract the result.

iOS 8 Metal Tutorial with Swift: Getting Started - Ray Wenderlich -- a *lot* of code to write, but it is all API calls rather than filling in complicated structs. The various buffers and other data structures are maintained behind the scenes by the Metal runtime. From what I've found about Mantle, it seems rather similar. glNext will likely follow the same pattern, and if so, then someone will have to write an "EZ glNext" library for it, with lots of sensible but overridable defaults.
 
WWDC 2014 Session Videos - Apple Developer has "Working with Metal: Fundamentals", and one can download a video or its slides as a PDF if one can't view an inlined streaming version. It goes through a graphics version of "Hello, World": drawing a triangle. Note that Metal uses Objective-C, like Cocoa. (A rough code sketch of this flow, with made-up stand-in names, follows the two step lists below.)


Init steps:

Create an object that refers to the video card
Create a command queue for the video-card object
Create a vertex buffer for the video-card object and stuff it with the vertex data

Create a render-pipeline object
Add vertex and fragment (pixel) shader functions to it
Give it the framebuffer's pixel format
Compile it

In the shader functions, one can annotate the args and struct members to tell the compiler what they refer to, using Metal's double-bracket [[attribute]] syntax

Create a pane to display the results -- it's officially called a view, but I like to call it a pane, because that's a part of a window

Drawing steps:

Create a command buffer for the command queue
Get a drawable from the display-pane object
Create a render-pass object
Give it the drawable's contents, the action when loading, and the color for clearing it

Create a render-command object
Give it the render-pass object, the render-pipeline object, the vertex-buffer object, and a draw command (drawPrimitives)
Commit it
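
Pulling the two lists together, here is a toy C++ mock of that object model as I understand it. Every name in it is a made-up stand-in for illustration -- Metal's real API is Objective-C (MTLDevice, MTLCommandQueue, MTLRenderPipelineState, and so on), so see the tutorials linked above for the real calls.
Code:
#include <cstdio>
#include <string>
#include <vector>

struct Device {};                                  // refers to the video card
struct CommandQueue { Device *device; };
struct VertexBuffer { std::vector<float> data; };  // vertex data for the GPU
struct RenderPipeline { bool compiled; };          // shaders + pixel format
struct Drawable {};                                // the pane/view's backing image
struct RenderPass { Drawable *target; };           // load action, clear color, ...
struct CommandBuffer { std::vector<std::string> commands; };

int main() {
    // Init steps, done once:
    Device device;
    CommandQueue queue{&device};
    VertexBuffer vertices{{0,0,0, 1,1,0, 2,0,0}};  // one triangle
    RenderPipeline pipeline{false};                // add shaders, pixel format...
    pipeline.compiled = true;                      // ...then compile it

    // Drawing steps, done every frame:
    CommandBuffer cmd;                             // created from the queue
    Drawable drawable;                             // fetched from the pane/view
    RenderPass pass{&drawable};                    // to be cleared to some color
    cmd.commands = {"set render pass", "set pipeline",
                    "set vertex buffer", "drawPrimitives"};
    for (const std::string &c : cmd.commands)      // "commit": the GPU works
        std::printf("GPU executes: %s\n", c.c_str());  // through the commands
    return 0;
}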


From John Carmack on early Direct3D:

OpenGL
Code:
glBegin (GL_TRIANGLES);
glVertex3f (0,0,0);
glVertex3f (1,1,0);
glVertex3f (2,0,0);
glEnd ();

Direct3D
Code:
(pseudocode, and incomplete)
v = &buffer.vertexes[0];
v->x = 0; v->y = 0; v->z = 0;
v++;
v->x = 1; v->y = 1; v->z = 0;
v++;
v->x = 2; v->y = 0; v->z = 0;
c = &buffer.commands;
c->operation = DRAW_TRIANGLE;
c->vertexes[0] = 0;
c->vertexes[1] = 1;
c->vertexes[2] = 2;
IssueExecuteBuffer (buffer);

More recent OpenGL would use something like
Code:
float Vertices[] = {0,0,0, 1,1,0, 2,0,0};
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, Vertices);

glDrawArrays(GL_TRIANGLES, 0, 3);
Even more recent OpenGL would let you put your vertex data into a vertex buffer, though it would still use glDrawArrays(). OpenGL can also accept a list of indices of vertices to draw: glDrawElements().
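
For the curious, here is a rough sketch of the buffer-object version in the older client-state style, reusing the Vertices array from the previous snippet and assuming a GL 1.5-or-later context is already set up:
Code:
// Upload the vertex data into a buffer object once...
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);

// ...then draw from it; glVertexPointer's last arg is now an offset into
// the bound buffer rather than a client-memory pointer.
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, (void *)0);

// Drawing by index instead of by order of appearance:
GLuint Indices[] = {0, 1, 2};
glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_INT, Indices);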

Recent Direct3D has a command nearly identical to glDrawArrays(): DrawPrimitive()
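
Something like this, if I have the Direct3D 9 usage right -- it assumes the device already has a vertex buffer selected with SetStreamSource() and SetFVF():
Code:
// One triangle from the currently selected vertex stream, starting at vertex 0.
device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 1);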
 
Some history. I recall from somewhere how Microsoft attempted to promote Direct3D in the consumer space by hurting OpenGL. This was in the late 1990's.

For Windows NT, all that was necessary to support OpenGL was to write a driver that draws triangles in display coordinates. But for Windows 95/98, it was necessary to write a much bigger driver, one that also handled transforming vertices from model coordinates to display coordinates. Since Win9x could have used WinNT's OpenGL code for doing that, that seems like obstructionism.

An issue with OpenGL on consumer-grade machines back then was that OpenGL is a rather large standard. So when John Carmack decided to use OpenGL, he used only a subset of it, and video-card makers shipped "miniGL" drivers implementing just that subset.

In the late 1990's, M$ teamed up with Silicon Graphics to try to unify OpenGL and Direct3D as Fahrenheit. But that effort did not get very far, and M$ mainly learned how to rip off OpenGL, it seems.


Turning to Apple, it had a rather interesting history with 3D API's. In the early 1990's, it developed QuickDraw 3D, which came in two parts, a high-level and a low-level part. The high-level part is a scene-graph API, where one creates 3D models and lights and cameras and the like. A 3D game engine has a specialized sort of scene graph. Apple eventually dropped that, and Quesa is an open-source, cross-platform imitation of it. The low-level part is RAVE, the Rendering Acceleration Virtual Engine. It was a sort of OpenGL Lite, for drawing triangles in display coordinates. Apple dropped that one also in the late 1990's, committing to OpenGL.
 
OpenGL has gotten a boost in recent years from Apple using it in iOS. Likewise, Google uses it in Android. On game consoles, it looks like the Playstation 3's PSGL is more-or-less a version of it. Sony being in on the glNext project suggests that Sony may support it for the PS4 and subsequent generations.
 
Thanks for all the info, Lpetrich. Those are exactly the kind of answers I was looking for.

So what's your overall take on Mantle vs OpenGL vs Direct3D/DirectX?
 
So what's your overall take on Mantle vs OpenGL vs Direct3D/DirectX?
I think that Mantle and Metal are going to be transitional API's, and that they will probably be phased out as glNext gains acceptance. I think it significant that the developers of those two API's, AMD and Apple, are involved in the glNext effort.

 
So what's your overall take on Mantle vs OpenGL vs Direct3D/DirectX?
Now for Direct3D. Microsoft has long preferred Direct3D to OpenGL, and I think that it mainly supports OpenGL because a lot of professional 3D graphics software continues to use it, like professional CAD software.

Roughly from 1995 to 2005, OpenGL had been rather vulnerable. Mass-market desktop PeeCees with fancy video cards ate into the market for specialized graphics workstations. M$ was also promoting Direct3D rather heavily for 3D-game development. During that time, John Carmack convinced Apple's management that it should dump RAVE and adopt OpenGL. That made OpenGL run on a lot of mass-market hardware and get used for 3D-game development and porting.

But starting around 2005, smartphones began getting 3D-graphics capability, and Apple and Google naturally supported OpenGL for them. The same happened with tablets running related OSes.

Game consoles were not much involved, since access to their 3D capabilities was often through rather low-level API's. Bungie decided on M$'s Xbox for Halo because writing for Direct3D was easier than using such low-level API's. By then, Direct3D was much improved over what John Carmack had legendarily reviewed. However, Sony is known to support a modification of OpenGL for the Playstation 3, though I haven't found anything on OpenGL and the Playstation 4.


While M$ continues to dominate desktop and laptop computer OSes, it does not dominate the smartphone, tablet, or game-console markets. Though M$ has done well with its Xbox series of game consoles, it has failed to get much of the market for smartphones or tablets. So Direct3D risks getting marginalized because of being M$-only. Desktop computers and Xboxes will likely keep it going, but some M$ executive might decide that it's not worth the trouble.
 
glNext now has a name: Vulkan, with a k, from the German word for volcano: Vulkan - Graphics and compute belong together, Khronos Reveals Vulkan API for High-efficiency Graphics and Compute on GPUs - Khronos Group Press Release. It was inspired by Mantle, and from descriptions of it, it works much like Metal. Unlike OpenGL proper, it does not have any global state; instead, it has objects for GPU's, render pipelines, command queues, command buffers, etc. Several threads can each be given a command buffer to fill, and another thread can submit the finished buffers to a command queue.
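
To make that concrete, here is a minimal C++ sketch of recording and submitting a command buffer. The spec is not final, so take the details with a grain of salt; the calls below follow Vulkan's C API, and all the object setup and render-pass begin/end work that a real frame needs is omitted. Each worker thread would record into its own command buffer like this, with one thread submitting the finished buffers to a queue.
Code:
#include <vulkan/vulkan.h>

// Record a trivial draw into a command buffer, then submit it to a queue.
// Assumes cmdBuf, pipeline, and queue were created during setup.
void RecordAndSubmit(VkCommandBuffer cmdBuf, VkPipeline pipeline, VkQueue queue)
{
    VkCommandBufferBeginInfo begin = {};
    begin.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO;
    vkBeginCommandBuffer(cmdBuf, &begin);

    // No global state: everything hangs off explicit objects.
    vkCmdBindPipeline(cmdBuf, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
    vkCmdDraw(cmdBuf, 3, 1, 0, 0);        // 3 vertices, 1 instance: a triangle
    vkEndCommandBuffer(cmdBuf);

    // Submission is its own explicit step; nothing runs until here.
    VkSubmitInfo submit = {};
    submit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    submit.commandBufferCount = 1;
    submit.pCommandBuffers = &cmdBuf;
    vkQueueSubmit(queue, 1, &submit, VK_NULL_HANDLE);
}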

Vulkan accepts shaders in an intermediate language called SPIR-V, sort of like Java bytecode: SPIR - The first open standard intermediate language for parallel compute and graphics. A SPIR-V module is a sequence of 32-bit words, and each instruction's first word packs its opcode together with the instruction's total word count. That makes it easy to skip through SPIR-V code, and parsing it is much easier than parsing a source language like GLSL, making drivers much simpler. OpenGL drivers currently accept shaders as GLSL source.
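
Since each instruction announces its own length, skipping through a module is trivial. A small C++ sketch of that, assuming the module has already been loaded into memory as a sequence of 32-bit words:
Code:
#include <cstdint>
#include <cstdio>
#include <vector>

// Walk a SPIR-V module, printing each instruction's opcode. The first word of
// every instruction packs its word count (high 16 bits) and its opcode (low
// 16 bits), so we can hop over instructions without understanding them.
void ListOpcodes(const std::vector<uint32_t> &words)
{
    size_t i = 5;                        // skip the five-word module header
    while (i < words.size()) {
        uint16_t wordCount = words[i] >> 16;
        uint16_t opcode = words[i] & 0xFFFF;
        std::printf("opcode %u in %u words\n", (unsigned)opcode,
                    (unsigned)wordCount);
        if (wordCount == 0) break;       // malformed; avoid an infinite loop
        i += wordCount;
    }
}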

Another user of SPIR-V is OpenCL 2.1: OpenCL - The open standard for parallel programming of heterogeneous systems, Khronos Releases OpenCL 2.1 Provisional Specification for Public Review - Khronos Group Press Release. OpenCL is for doing number crunching on GPU's, using their SIMD capabilities.


Vulkan is still a work in progress, but it can already achieve much greater CPU efficiency than existing OpenGL drivers. Its first version should be finalized later this year.
 
This is outstanding news.

This may well be all it takes for Linux distros to take over the gaming PC market from Windows.
 