OpenGL 3.0 and GLSL 1.30 specifications released

All news such as new applications, new drivers, etc. belongs here.



Post by ChemicalBrother » 11.08.2008, 20:36

Source: Phoronix

The Khronos Group has released the specifications for the OpenGL 3.0 API and GLSL 1.30. The Khronos Group is the consortium that currently leads and drives the development of OpenGL and related standards (GLSL, OpenCL, etc.).

Anyone who is interested can download the specifications here. Together, the OpenGL 3.0 and GLSL 1.30 documents come to a little over 600 pages in PDF form.

From now on it is a matter of waiting until support arrives from the graphics card manufacturers. Intel reportedly has not yet started on the integration, AMD/ATI has not commented yet, and Nvidia is rumoured to be bringing OpenGL 3.0 support with a "Big Bang II" driver release.



Comment: Anyone who wants to know everything OpenGL brings with it can read up on it in the Phoronix article, or in the specifications. For my part, I don't really understand much of it. :)

Post by Tettsch » 12.08.2008, 17:39

Over at [url=http://www.pro-linux.de/news/2008/13038.html]Pro-Linux[/url] the comments on OpenGL 3.0 are rather subdued, along the lines of it not coming anywhere close to DirectX 10.

Does anyone know what OpenGL 3.0 makes possible graphically? Especially with regard to games.

Pro-Linux has plenty of negative comments, but the arguments are missing. I have the feeling that it's just flaming again without any real knowledge behind it, but you never know.

Post by ChemicalBrother » 12.08.2008, 18:34

Probably only developers who actually work with OpenGL can really judge that, and so far I haven't read anything positive. The word is that OpenGL promised considerably more than it delivered, and that only CAD users benefit. Gamers apparently get left behind.

Post by Whistle » 12.08.2008, 19:56

As far as I understand it, the API was mostly just cleaned up. Parts of the API were marked as deprecated but not yet removed, in order to stay backwards compatible.
It would have been nice if more had come of it: features that convince developers to use OpenGL.
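
To make the deprecation point a bit more concrete, here is a small, untested C sketch contrasting the old immediate-mode path, which the 3.0 spec lists as deprecated, with the buffer-object path that stays in the core. It assumes a GL context is already current, the entry points are loaded (e.g. via GLEW), and a shader program with a vec3 attribute at location 0 is bound; treat it as an illustration, not working sample code:

[code]
#include <GL/glew.h>

/* Deprecated path: immediate mode. This still works in a full OpenGL 3.0
 * context, but it is on the deprecation list. */
static void draw_triangle_deprecated(void)
{
    glBegin(GL_TRIANGLES);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();
}

/* Core path: vertex data lives in a buffer object and is fed to a generic
 * vertex attribute instead of the fixed-function pipeline. */
static void draw_triangle_core(void)
{
    static const GLfloat verts[] = {
        0.0f, 0.0f, 0.0f,
        1.0f, 0.0f, 0.0f,
        0.0f, 1.0f, 0.0f,
    };
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (const void *)0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDeleteBuffers(1, &vbo);
}
[/code]

Both paths draw the same triangle on a 3.0 driver today; only the second one survives once the deprecated functionality is actually removed.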

Post by LinuxDonald » 12.08.2008, 23:11

Let's wait and see what the game companies that rely on OpenGL, à la id Software, have to say.

Post by ChemicalBrother » 12.08.2008, 23:17

[quote=LinuxDonald,index.php?page=Thread&postID=30483#post30483]Let's wait and see what the game companies that rely on OpenGL, à la id Software, have to say.[/quote]

id Software has meanwhile switched to DirectX, see Rage. I don't know how that will develop at id Software going forward, though.

Post by LinuxDonald » 13.08.2008, 01:14

They don't rely only on DX, though; Linux support isn't dead at id Software, even if they say it is no longer that important to them. So they still support OpenGL, not least because of the Mac versions.

Post by q00 » 13.08.2008, 18:43

I'm no expert on this, but Golem has written a lot of positive things about it: http://www.golem.de/0808/61667.html
It probably won't be such a huge disappointment after all. :D

Post by Mark » 13.08.2008, 21:12

Hi
I found this forum post. It seems quite interesting, though I haven't read all the way through it yet:
What happened to Longs Peak?

In January 2008 the ARB decided to change directions. At that point it had become clear that doing Longs Peak, although a great effort, wasn't going to happen. We ran into details that we couldn't resolve cleanly in a timely manner. For example, state objects. The idea there is that all state is immutable. But when we were deciding where to put some of the sample ops state, we ran into issues. If the alpha test is immutable, is the alpha ref value also? If we do so, what does this mean to a developer? How many (100s?) of objects does a developer need to manage? Should we split sample ops state into more than one object? Those kinds of issues were taking a lot of time to decide.

Furthermore, the "opt in" method in Longs Peak to move an existing application forward has its pros and cons. The model of creating another context to write Longs Peak code in is very clean. It'll work great for anyone who doesn't have a large code base that they want to move forward incrementally. I suspect that that is most of the developers that are active in this forum. However, there is a class of developers for which this would have been a potentially very large burden. This clearly is a controversial topic, and has its share of proponents and opponents.

While we were discussing this, the clock didn't stop ticking. The OpenGL API *has to* provide access to the latest graphics hardware features. OpenGL wasn't doing that anymore in a timely manner. OpenGL was behind in features. All graphics hardware vendors have been shipping hardware with many more features available than OpenGL was exposing. Yes, vendor specific extensions were and are available to fill the gap, but that is not the same as having a core API including those new features. An API that does not expose hardware capabilities is a dead API.

Thus, prioritization was needed, and we made several decisions.

1) We set a goal of exposing hardware functionality of the latest generations of hardware by this Siggraph. Hence, the OpenGL 3.0 and GLSL 1.30 API you guys all seem to love.

2) We decided on a formal mechanism to remove functionality from the API. We fully realize that the existing API has been around for a long time, has cruft and is inconsistent in its treatment of objects (how many object models are in the OpenGL 3.0 spec? You count). In its shortest form, removing functionality is a two-step process. First, functionality will be marked "deprecated" in the specification. A long list of functionality is already marked deprecated in the OpenGL 3.0 spec. Second, a future revision of the core spec will actually remove the deprecated functionality. After that, the ARB has options. It can decide to do a third step, and fold some of the removed functionality into a profile. Profiles are optional to implement (more below), and their functionality might still be very important to a subset of the OpenGL market. Note that we also decided that new functionality does not have to, and likely will not, work with deprecated functionality. That will make the spec easier to write, read and understand, and drivers easier to implement.

3) We decided to provide a way to create a forward-compatible context. That is an OpenGL 3.0 context with all deprecated features removed. Giving you, as a developer, a preview of what a next version of OpenGL might look like. Drivers can take advantage of this, and might be able to optimize certain code paths in the forward-compatible context only. This is described in the WGL_ARB_create_context extension spec.

4) We decided to have a formal way of defining profiles. During the Longs Peak design phase, we ran into disagreement over what features to remove from the API. Longs Peak removed quite a lot of features as you might remember. Not coincidentally, most of those features are marked deprecated in OpenGL 3.0. The disagreements happened because of different market needs. For some markets a feature is essential, and removing it will cause issues, whereas for another market it is not. We discovered we couldn't do one API to serve all. A profile encapsulates functionality needed to meet the needs of a particular market. Conformant OpenGL products may implement one or more profiles. A profile is by definition a subset of the whole core specification. The core OpenGL specification will contain all functionality, including what is in a profile, in a coherently designed whole. Profiles simply enable products for certain markets to not ship functionality that is not relevant to those markets in a well defined way. Only the ARB may define profiles; individual vendors may not (this is in contrast to extensions).

5) We will keep working on object model issues. Yes, this work has been put on the back burner to get OpenGL 3.0 done, but we have picked that work up again. One of the early results of this is that we will work on folding object model improvements into the core in a more incremental manner.

6) We decided to provide functionality, where possible, as extensions to OpenGL 2.1. Any OpenGL 3.0 feature that does not require OpenGL 3.0 hardware is also available in extension form to OpenGL 2.1. The idea here is that new functionality on older hardware enables software vendors to provide upgrades to their existing users.

7) We decided that OpenGL is not going to evolve into a general GPU compute API. In the last two years or so, compute using a GPU and a CPU has taken off; in fact, it is exploding. Khronos has recognized this and is on a fast track to define and release OpenCL, the open standard for compute programming. OpenGL and OpenCL will be able to share data, like buffer objects, in an efficient manner.

There are many good ideas in Longs Peak. They are not lost. We would be stupid to ignore them. We spent almost two years on it, and a lot of good stuff was designed. There is a desire to work on object model issues in the ARB, and we recently started doing that again. Did you know that you have no guarantee that if you change properties of a texture or render buffer attached to a framebuffer object, the framebuffer object will actually notice? It has to notice it, otherwise your next rendering command will not work. Each vendor's implementation deals with this case a bit differently. If you throw in multiple contexts in the mix, this becomes an even more interesting issue. The ARB wants to do object model improvements right the first time. We can't afford to do it wrong. At the same time, the ARB will work on exposing new hardware functionality in a timely manner.

I want to ask you to take a deep breath, let this all sink in a bit, and then open up the OpenGL 3.0 and GLSL 1.30 specifications we just posted, which have all the new stuff clearly marked. Hopefully you'll agree with me that there's quite a lot of new stuff to be excited about.

http://www.opengl.org/registry/doc/glspec30.20080811.withchanges.pdf
http://www.opengl.org/registry/doc/GLSLangSpec.Full.1.30.08.withchanges.pdf

This is certainly not the end of the OpenGL API. OpenGL will evolve and will become better with every new revision. I welcome constructive feedback.

Regards,
Barthold Lichtenbelt
OpenGL ARB Working Group chair

[url=http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=243307&#Post243307]Source[/url]
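
Two of the points above are easy to make concrete. Point 3 mentions the WGL_ARB_create_context extension; below is a rough, untested C sketch of how a forward-compatible 3.0 context would be requested on Windows (an equivalent GLX extension exists for X11). It assumes an old-style context is already current so that the new entry point can be fetched, and that wglext.h from the opengl.org registry is available:

[code]
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   /* token and typedef definitions from the registry */

/* Returns a forward-compatible OpenGL 3.0 context, i.e. one with all
 * functionality marked deprecated in the 3.0 spec removed, or NULL if the
 * driver does not expose WGL_ARB_create_context. */
HGLRC create_gl3_forward_compatible(HDC hdc)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0   /* end of attribute list */
    };
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");

    if (!wglCreateContextAttribsARB)
        return NULL;
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}
[/code]

And for point 6, a small sketch of how an application might check at runtime whether a 3.0 feature is usable, either because the driver already reports OpenGL 3.0 or because the matching ARB extension shows up on a 2.1 driver. GL_ARB_vertex_array_object is just an example pick here, and the naive substring check is only meant as an illustration:

[code]
#include <string.h>
#include <GL/gl.h>

/* Returns non-zero if vertex array objects can be used, either as core
 * OpenGL 3.0 functionality or via the ARB extension on an older driver. */
int has_vertex_array_objects(void)
{
    const char *version    = (const char *)glGetString(GL_VERSION);
    const char *extensions = (const char *)glGetString(GL_EXTENSIONS);

    if (version && version[0] >= '3')   /* "3.0" or later reported */
        return 1;
    if (extensions && strstr(extensions, "GL_ARB_vertex_array_object"))
        return 1;
    return 0;
}
[/code]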

