Merge remote-tracking branch 'origin/Ghidra_9.2'

This commit is contained in:
ghidra1 2020-11-10 13:56:03 -05:00
commit b4195250f9
19 changed files with 554 additions and 124 deletions

View File

@ -53,30 +53,36 @@
contributions. Thanks to all those who have contributed their time, thoughts, and code. The Ghidra user community
thanks you too!</P>
<P>NOTE: The Ghidra 9.0 server is compatible with Ghidra 9.x clients, however starting with 9.1 the server
<P>NOTE: Ghidra Server: The Ghidra 9.0 server is compatible with Ghidra 9.x clients, however starting with 9.1 the server
requires clients to use a TLS secure connection for the initial RMI registry port access.
If the Ghidra multi-user server is upgraded to 9.2, then all clients must
upgrade to 9.2. A 9.x Ghidra client will fall back to a non-TLS connection when accessing the RMI Registry on
a 9.0 server. Note that all other server interactions, including authentication, were and continue to be
performed over a secure TLS connection.</P>
<P>Minor Note: If a processors instruction implementation has changed significantly, any generated .fidb files using that
<P>Minor Note: FIDB Files: If a processor's instruction implementation has changed significantly, any generated .fidb files using that
processor definition may need to be regenerated.
Changes that could require regeneration include changes in instruction size, the number of operands, the nature of
the operands, or register decoding for an operand. The 64-bit x86 has had such changes; for example, there were
changes to the decoded register for many instructions with prefix byte overrides. All the provided .fidb files have
been regenerated, and new ones for VS 2017/2019 have been added.</P>
<P>Minor Note: Ghidra-compiled .sla files are not always backwards compatible due to changes in the underlying .sla
<P>Minor Note: SLA Files: Ghidra-compiled .sla files are not always backwards compatible due to changes in the underlying .sla
specification. In the pre-built Ghidra distribution, all .sla files are re-built from scratch. However, if you have local processor modules,
or are building Ghidra from scratch, you may need to do a clean build. Any processor modules with changes are normally recompiled
at Ghidra startup, so this situation is rare.</P>
<P>Minor Note: AARCH64 Long: The size of a <b>long</b> on the AARCH64 has been changed from 4-bytes to 8-bytes in the data organization within the
compiler specification. This change could have ramifications in existing AARCH64 programs using a <b>long</b> within data structures or
custom storage of function parameters (dynamic storage should not be an issue). An included script <i><b>FixupCompositeDataTypesScript</b></i>
can be run, only with an <i>exclusive checkout</i> in Multi-User projects, on programs where the datatype size for <b>long</b> has changed. This general script can be used
whenever a program's base datatypes have changed in the compiler specification, which should be a rare occurrence.</P>
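The layout impact of this size change can be sketched in plain Java (an illustrative calculation assuming natural alignment of fields; this is not Ghidra's data organization code, and the struct shape is hypothetical): for a structure like <code>{ int a; long b; }</code>, growing <b>long</b> from 4 to 8 bytes moves the offset of <code>b</code> and grows the total size.

```java
public class LongSizeLayout {
	// Offset of each field and total size for struct { int a; long b; }
	// under natural alignment, given a chosen size for 'long'.
	static int[] layout(int longSize) {
		int offsetA = 0;                 // int a at offset 0, 4 bytes
		int cursor = offsetA + 4;
		// align the long field to its own size
		int offsetB = (cursor + longSize - 1) / longSize * longSize;
		int size = offsetB + longSize;
		return new int[] { offsetA, offsetB, size };
	}

	public static void main(String[] args) {
		// 4-byte long (old spec): b at offset 4, total size 8
		System.out.println(java.util.Arrays.toString(layout(4)));
		// 8-byte long (new spec): b at offset 8, total size 16
		System.out.println(java.util.Arrays.toString(layout(8)));
	}
}
```

Any structure whose offsets shift like this is exactly what the fixup script must repair in the database.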
<H2>Open Source Based Graphing</H2>
<P>Ghidra has been integrated with an open source graph visualization package, called JUNGRAPHT, to display interactive
block graphs, call graphs, and AST control flow graphs, and provides a general API to create graphs within plugins and scripts.
Prior to public release graphing had been provided by a legacy graphing package which was un-releasable publicly due to
Prior to initial public release, graphing had been provided by a legacy graphing package which was un-releasable publicly due to
licensing issues.</P>
<P>Graphs are displayed in a new tabbed graph window. Current location and selection of vertices are kept in sync with other
@ -84,11 +90,23 @@
layout algorithms to examine the program structure. In addition, graphs can be exported in several standard graph formats, such as
CSV, GraphML, GML, JSON, and Visio. The exported file can then be imported into external tools.</P>
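As a rough illustration of the kind of file a GraphML export produces (a hand-written sketch of the standard format, not Ghidra's actual exporter or its output; the class and edge names are invented), a minimal directed graph can be serialized as:

```java
public class GraphMLSketch {
	// Emits a minimal GraphML document for a directed graph
	// given as a list of (source, target) edge pairs.
	static String toGraphML(String[][] edges) {
		StringBuilder sb = new StringBuilder();
		sb.append("<graphml xmlns=\"http://graphml.graphdrawing.org/xmlns\">\n");
		sb.append("  <graph edgedefault=\"directed\">\n");
		// collect the distinct vertices in first-seen order
		java.util.Set<String> nodes = new java.util.LinkedHashSet<>();
		for (String[] e : edges) {
			nodes.add(e[0]);
			nodes.add(e[1]);
		}
		for (String n : nodes) {
			sb.append("    <node id=\"").append(n).append("\"/>\n");
		}
		for (String[] e : edges) {
			sb.append("    <edge source=\"").append(e[0])
					.append("\" target=\"").append(e[1]).append("\"/>\n");
		}
		sb.append("  </graph>\n</graphml>\n");
		return sb.toString();
	}

	public static void main(String[] args) {
		System.out.print(toGraphML(new String[][] { { "main", "helper" } }));
	}
}
```

External tools such as yEd or Gephi consume documents of this shape directly.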
<P>The graphing capability is implemented by a general service mechanism allowing other graph providers to be implemented to support
a favorite graphing tool, however, users will most likely be satisfied with the new default implementation.
There will be a follow up capabilities such as graph specific popup actions on the the nodes and edges that can be added by
<P>The graphing capability is implemented by a general service mechanism allowing other graph providers to be implemented
to support a favorite graphing tool; however, users will most likely be satisfied with the new default implementation.
There will be follow-up capabilities, such as graph-specific popup actions on the nodes and edges, that can be added by
the creator of the graph before display. As in everything, the Ghidra team is interested in any feedback you might provide
on this new capability.</P>
<H2>Java-based Universal PDB Reader/Analyzer/Loader</H2>
<P>Added a new platform-independent PDB Reader/Analyzer/Loader that has the ability to process
raw PDB files and apply extracted information to a program. Because the new reader is written in Java, PDBs can be
utilized on any supported platform, not just on Windows as in prior Ghidra versions. PDBs can be applied during analysis
or by loading and applying the PDB before analysis. Information from PDBs can be force-loaded into a program
with a mismatched PDB signature, which is very useful for extracting data types to be used with the
program from a PDB related to that program. Loading the PDB utilizes a new underlying Universal
Reader API.</P>
<P>The PDB Reader and Analyzer capabilities are an evolutionary development; we expect to expand them in future
releases, adding to their capabilities and fixing bugs. If the new PDB Analyzer causes issues, you can turn it off
and use the original PDB Analyzer.</P>
<H2>Dynamic Modules: OSGi model for scripting</H2>
<P>A change to scripting brings a powerful form of dynamic extensibility to Ghidra scripting, where Java source code is (re)compiled, loaded, and
@ -96,25 +114,65 @@
up code into modules. To support modularity while preserving the dynamic nature of scripts, Ghidra uses OSGi. The new feature
provides better script change detection, external jar dependencies, script lifecycle management, and modularity.</P>
<P>To find out more, bring up Help contents in Ghidra, and search for OSGi or Bundles.</P>
<H2>JAVA based Univeral PDB</H2>
<P>Automatically applies. Extract data types. Mismatched PDB's. More to come.</P>
<P>Reader API</P>
<H2>Decompiler</H2>
<P>There have been numerous changes to the decompiler addressing quality, readability, and usability. Decompilation has been improved by:
<ul style="padding-left:80px">
<li>Fewer Casts - The decompiler can better recognize lower precision operations performed with bigger registers, allowing it to eliminate
extraneous casts and concatenations.</li>
<li>Better Strings - All the alternate string formats and encodings recognized by Ghidra are now displayed properly by the decompiler,
and string references contained inside larger strings are better recognized.</li>
<li>Controllable Namespace Info - Namespace information, as configured by the user, can now be displayed as part of rendering symbols in decompiler output.
The default minimal display configuration will print only the minimal number of path
elements necessary to uniquely resolve the symbol within the current scope.</li>
<li>Arrays - Analysis of array expressions in the decompiler has improved, simplifying many new optimized array access forms.</li>
</ul>
</P>
<P>The decompiler GUI has also been enhanced with the addition of multiple highlights of varying color, called secondary highlights. In addition,
the Decompiler's Auto Create/Fill Structure commands incorporate data-type information from function prototypes
and will override undefined or more general data-types with discovered data-types that are more specific.</P>
<P>There is rewritten, more comprehensive Decompiler documentation too!</P>
<H2>Performance Improvements</H2>
<P>There have been major performance improvements in both analysis and the display and filtering of information within GUI components.
These changes are most notable on large binaries, with reports of analysis time improving from more than 24 hours to under an hour. Some operations
were done so inefficiently that the end user might give up on analysis. Please report if you notice any severe performance issues
or binaries that take a large amount of time to process, if you can find an example binary that is easily obtainable that reproduces
the issue we can identify the root cause and hopefully improve it. There are some continued sore performance areas we are still working
or binaries that take a large amount of time to process. If you can find an easily obtainable example binary that reproduces
the issue, the root cause can be identified and hopefully improved. There are some remaining performance sore spots we are still working on,
such as the non-returning function analyzer. We hope you will find the binary analysis speed and interactivity much improved.</P>
<P>Some specific areas of improvement are binaries with rich data type information, RTTI information, Exception records, large number
<P>Some specific areas of improvement are binaries with rich data type information, RTTI information, exception records, a large
number of bytes, a large number of defined symbols, and many symbols at a single address.</P>
<H2>Function Identification Improvements</H2>
<P>Function Identification databases have been re-created from scratch, including new information for Visual Studio 2017 and 2019 libraries.
The databases have been cleaned and should overall result in more matches, with fewer mismatched or multiple matches for identified functions.
In addition, the FID libraries had to be rebuilt from scratch due to errors or differences in instruction set decode (especially in 64-bit x86)
with prior versions of Ghidra. The FID is sensitive to the actual instruction bytes, the mnemonic, registers, and the number of operands.</P>
<P>Several further improvements have been identified that will be added in a future release. Until then, to get an even better
positive match rate, turn on the <i>Shared Return Calls Analyzer</i> option <i>Assume Contiguous Functions Only</i>, and possibly <i>Allow Conditional Jumps</i>.
For normal, clean binaries that are not heavily optimized, malware, or obfuscated, these options should cause few issues.</P>
<H2>Symbol Demangling</H2>
<P>Demangling of both GNU and Microsoft symbols has been greatly improved, resulting in fewer undemangled symbols and better function signature recovery.</P>
<H2>Processor Models</H2>
<P>Several new processor specifications have been added, from very old processors to more recent: CP1600, M6809, M8C, RISC-V, V850.</P>
<P>Note: the Elan EM78xxx just missed the 9.2 cutoff, but should appear shortly.</P>
<P>Many improvements and bug fixes have been made to existing processor
specifications: ARM, AARCH64, AVR8, CRC16C, PIC24/30, SH2, SH4, TriCore, X86, XGATE,
6502, 68K, 6805, M6809, 8051, and others. Of note, the AARCH64 has been updated to support all v8.6 spec instructions.
Many improvements have been contributed by the Ghidra
community, while others were discovered and fixed using a currently internal tool which automates fuzzing
of individual instructions against an external emulator or debugger. We hope to release the tool
in a near-term future release.</P>
<H2>Processor Specification</H2>
<P>Minor changes have been made to the Build process of the Sleigh Editor. For those trying to build it from scratch the
instructions are a little clearer and should work correctly. In addition the new popcount operator is supported.
<P>Minor changes have been made to the build process of the Sleigh Editor. For those trying to build it from scratch, the
instructions are a little clearer and should work correctly. In addition, the new POPCOUNT operator is supported.
For those modifying or studying Sleigh processor specifications who were unaware of the Sleigh Editor, we encourage
you to give it a try. We suggest you install and run the Sleigh Editor in an Eclipse installation separate from the one
with the entire Ghidra source code base imported; the Eclipse you use with the Ghidra runtime is one possibility.
@ -124,47 +182,20 @@
the Xtra SleighDevTools project. The plugin integrates with an external disassembler such as binutils, and provides a code browser
field that displays the disassembly from an external disassembler, such as binutils, at each instruction or undefined byte in the listing.
The only external disassembler integration provided is binutils, however it is possible to add support for additional external disassemblers.
Previously the External Disassembler had trouble with instruction sets which have alternate mode set of instruction
Previously the External Disassembler had trouble with instruction sets which have an alternate instruction encoding mode,
such as Thumb or MicroMIPS. The working aid field has new configuration files to feed different options to the external disassembler
to choose the correct alternate encoding set. This also works well with several scripts that aid in processor development, such as
the <i>CompareSleighExternal</i> script.</P>
<P>A new pcode operation POPCOUNT is supported in sleigh processor specifications, this was mainly added to deal with instructions
<P>A new p-code operation POPCOUNT is supported in Sleigh processor specifications. POPCOUNT was mainly added to deal with instructions
that need to compute the parity of a result.
In addition, the Sleigh compiler error messages have been reworked to be more comprehensible, consistent in format layout, and to provide
correct line numbers as close to the error as possible. Several cases are now caught during compilation that previously would
pass compilation but cause issues during use of the processor.</P>
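The parity computation that POPCOUNT enables can be illustrated in plain Java (an illustrative sketch of the semantics, not Ghidra or Sleigh code): a processor parity flag, such as x86 PF, is typically set when the low byte of a result contains an even number of set bits, and a population count makes that a one-line expression.

```java
public class PopcountParity {
	// Parity flag semantics (e.g., x86 PF): set when the low 8 bits
	// of a result contain an even number of 1 bits.
	static boolean parityFlag(int result) {
		// Integer.bitCount is Java's population count ("popcount")
		int ones = Integer.bitCount(result & 0xFF);
		return (ones & 1) == 0; // even number of set bits -> flag set
	}

	public static void main(String[] args) {
		System.out.println(parityFlag(0b0000_0011)); // two set bits -> true
		System.out.println(parityFlag(0b0000_0111)); // three set bits -> false
	}
}
```

In a Sleigh specification the same idea is expressed directly with the popcount operator instead of the long chain of shifts and XORs it previously required.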
<H2>Function Identification Improvements</H2>
<P>Function Identification databases have been recreated from scratch, including new information for Visual Studio 2017 and 2019 libraries.
The databases have been cleaned and should overall result in more matches with fewer mis-matched or multiple matches for identified functions.
In addition the FID libraries had to be rebuilt from scratch due to errors or differences in instruction set decode (especially in the 64-bit X86)
with prior versions of Ghidra. The FID is sensitive to the actual instruction bytes, the mnemonic, register, and number of operands.</P>
<P>There are several new improvements that have been identified that will be added in a future release. Until then to get an even better increased
positive match rate to turn on the Shared Return Calls Analyzer option Assume Contiguous Functions Only, and possibly Allow Conditional Jumps.
For normal cleanly non-heavily optimized, non-malware or obfuscated binaries, these options should cause few issues.</P>
<H2>Symbol Demangling</H2>
<P>Both GNU and Microsoft symbol de-mangling has been greatly improved resulting in fewer unmangled symbols and better function signature recovery.</P>
<H2>Decompiler</H2>
<H2>Languages</H2>
<P>Several new processor specifications added, from very old processors to more recent: CP1600, M6809, M8C, RISC-V, V850.</P>
<P>Many improvements and bug fixes have been made to existing processor
specifications: ARM, AARCH64, AVR8, CRC16C, PIC24/30, SH2, SH4, TriCore, X86, XGATE,
6502, 68K, 6805, M6809, 8051, and others. Of note, the AARCH64 has been updated to support all v8.6 spec instructions.
Many of the processor improvements have been contributed by the Ghidra
community. Others were discovered and fixed with a currently internal tool which automates fuzzing
of individual instructions against an external emulator or debugger. We hope to put the tool
out in a near term future release.</P>
<H2>Dynamic Analysis Framework - Debugger</H2>
<P>The debugger is very much still in progress. You may have seen some commits, in the Ghidra github master branch, to get in sync with the debugger.
Stay tuned for more to come soon after the 9.2 release.</P>
<P>The debugger is very much still in progress. You may have seen some commits, in the Ghidra GitHub master branch, to get in sync with the debugger.
Stay tuned for more on the Dynamic Analysis Framework soon after the 9.2 release.</P>
<H2>Bug Fixes and Enhancements</H2>
<P> Numerous other bug fixes and improvements are fully listed in the <a href="ChangeHistory.html">ChangeHistory</a> file.</P>
@ -290,8 +321,8 @@
<H2>Ghidra Released to the Public!</H2>
<P>In case you missed it, in March 2019, a public version of Ghidra was released for the first time. Soon after,
the full buildable source was made available as an open source project on the NSA github page. The response from the Ghidra
Open Source community has been overwhelmingly positive. We welcome contributions from github including bug fixes,
the full buildable source was made available as an open source project on the NSA GitHub page. The response from the Ghidra
Open Source community has been overwhelmingly positive. We welcome contributions from GitHub including bug fixes,
requests, scripts, processor modules, and plugins. </P>
<H2> Bug Fixes and Enhancements</H2>

View File

@ -0,0 +1,47 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
//
// Fixes up all composite datatypes within the current program to account for
// any changes to primitive datatype sizes or alignment rules as defined
// by the associated data organization.
//
// This script requires exclusive access to the program to avoid the possibility
// of excessive change conflicts.
//
// This script can be run multiple times without harm.
//@category Data Types
import ghidra.app.script.GhidraScript;
import ghidra.program.database.data.DataTypeManagerDB;
public class FixupCompositeDataTypesScript extends GhidraScript {

	@Override
	protected void run() throws Exception {
		if (currentProgram == null) {
			return;
		}
		if (!currentProgram.hasExclusiveAccess()) {
			popup("This script requires an exclusive checkout of the program");
			return;
		}
		DataTypeManagerDB dtm = (DataTypeManagerDB) currentProgram.getDataTypeManager();
		dtm.fixupComposites(monitor);
	}
}

View File

@ -89,6 +89,7 @@ public class AlignAllDataTypesAction extends DockingAction {
"Are you sure you want to align all of the data types in " +
dataTypeManager.getName() +
"?\nBoth structures and unions that are currently unaligned will become aligned.\n" +
"This could cause component offsets to change and datatype sizes to change.\n" +
"Do you want to continue?", "Continue", OptionDialog.WARNING_MESSAGE);
if (result == OptionDialog.CANCEL_OPTION) {
return;
@ -125,7 +126,7 @@ public class AlignAllDataTypesAction extends DockingAction {
private void alignEachStructure(DataTypeManager dataTypeManager,
DataOrganization dataOrganization) {
Iterator<Composite> allComposites = dataTypeManager.getAllComposites();
Iterator<? extends Composite> allComposites = dataTypeManager.getAllComposites();
while (allComposites.hasNext()) {
Composite composite = allComposites.next();
composite.setInternallyAligned(true);

View File

@ -130,7 +130,10 @@ public class SymbolTreeRootNode extends SymbolCategoryNode {
Symbol functionSymbol = searchSymbol.getParentSymbol();
SymbolNode parentKey = SymbolNode.createNode(functionSymbol, program);
GTreeNode functionNode = findFunctionSymbolNode(parentKey, loadChildren, monitor);
return ((SymbolTreeNode) functionNode).findSymbolTreeNode(key, loadChildren, monitor);
if (functionNode != null) {
return ((SymbolTreeNode) functionNode).findSymbolTreeNode(key, loadChildren, monitor);
}
return null;
}
private GTreeNode findFunctionSymbolNode(SymbolNode key, boolean loadChildren,

View File

@ -15,10 +15,11 @@
*/
package ghidra.app.util.bin.format.dwarf4.next;
import static ghidra.program.model.data.DataTypeConflictHandler.ConflictResult.*;
import java.util.*;
import ghidra.program.model.data.*;
import static ghidra.program.model.data.DataTypeConflictHandler.ConflictResult.*;
/**
* This {@link DataTypeConflictHandler conflict handler} attempts to match
@ -185,22 +186,24 @@ class DWARFDataTypeConflictHandler extends DataTypeConflictHandler {
}
private DataTypeComponent getBitfieldByOffsets(Structure full, DataTypeComponent partDTC) {
DataTypeComponent fullDTC = full.getComponentAt(partDTC.getOffset());
if (fullDTC == null || fullDTC.getOffset() != partDTC.getOffset()) {
return null;
}
BitFieldDataType partBF = (BitFieldDataType) partDTC.getDataType();
DataTypeComponent fullDTC = full.getComponentAt(partDTC.getOffset());
if (fullDTC == null) {
return null;
}
int fullNumComp = full.getNumComponents();
for(int fullOrdinal = fullDTC.getOrdinal(); fullOrdinal < fullNumComp; fullOrdinal++) {
fullDTC = full.getComponent(fullOrdinal);
if (fullDTC.getOffset() != partDTC.getOffset()
|| !(fullDTC.getDataType() instanceof BitFieldDataType)) {
return null;
if (!(fullDTC.getDataType() instanceof BitFieldDataType) ||
fullDTC.getOffset() > partDTC.getOffset()) {
break;
}
BitFieldDataType fullBF = (BitFieldDataType) fullDTC.getDataType();
if ( fullBF.getBitOffset() == partBF.getBitOffset() ) {
if (fullDTC.getOffset() == partDTC.getOffset() &&
fullBF.getBitOffset() == partBF.getBitOffset() &&
fullBF.getBitSize() == partBF.getBitSize()) {
return fullDTC;
}
}
@ -339,7 +342,7 @@ class DWARFDataTypeConflictHandler extends DataTypeConflictHandler {
private long getDTPairKey(DataType dataType1, DataType dataType2) {
return ((long) System.identityHashCode(dataType1) << 32)
+ ((long) System.identityHashCode(dataType2) & 0xffffffffL);
+ (System.identityHashCode(dataType2) & 0xffffffffL);
}
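The change to getDTPairKey above drops a cast that was redundant (the `& 0xffffffffL` already promotes the expression to long). Why the mask itself matters can be shown standalone (an illustrative sketch, not the Ghidra source; class and method names are invented): without it, a negative second hash code sign-extends and corrupts the high 32 bits of the packed key.

```java
public class PairKey {
	// Packs two 32-bit hash codes into one 64-bit key:
	// first hash in the high word, second in the low word.
	static long pairKey(int h1, int h2) {
		return ((long) h1 << 32) + (h2 & 0xffffffffL);
	}

	// Buggy variant: a negative h2 sign-extends to 64 bits and
	// subtracts from the high word instead of filling the low word.
	static long buggyPairKey(int h1, int h2) {
		return ((long) h1 << 32) + h2;
	}

	public static void main(String[] args) {
		System.out.println(Long.toHexString(pairKey(1, -1)));      // 1ffffffff
		System.out.println(Long.toHexString(buggyPairKey(1, -1))); // ffffffff
	}
}
```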
private boolean addVisited(DataType dataType1, DataType dataType2, Set<Long> visitedDataTypes) {

View File

@ -0,0 +1,37 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package ghidra.app.decompiler.component;
/**
 * A poorly designed interface that does not correctly allow for modifying highlights
 */
@Deprecated // this will be removed after 9.2
public interface DecompilerHighlightService {

	/**
	 * Returns the layout model of the Decompiler
	 * @return the layout model
	 */
	@Deprecated
	public ClangLayoutController getLayoutModel();

	/**
	 * Clears the <b>primary</b> highlights in the Decompiler
	 */
	@Deprecated
	public void clearHighlights();
}

View File

@ -20,6 +20,7 @@ import java.util.*;
import org.jdom.Element;
import ghidra.app.CorePluginPackage;
import ghidra.app.decompiler.component.DecompilerHighlightService;
import ghidra.app.decompiler.component.hover.DecompilerHoverService;
import ghidra.app.events.*;
import ghidra.app.plugin.PluginCategoryNames;
@ -47,7 +48,8 @@ import ghidra.util.task.SwingUpdateManager;
servicesRequired = {
GoToService.class, NavigationHistoryService.class, ClipboardService.class,
DataTypeManagerService.class /*, ProgramManager.class */
},
},
servicesProvided = { DecompilerHighlightService.class },
eventsConsumed = {
ProgramActivatedPluginEvent.class, ProgramOpenedPluginEvent.class,
ProgramLocationPluginEvent.class, ProgramSelectionPluginEvent.class,
@ -81,6 +83,12 @@ public class DecompilePlugin extends Plugin {
disconnectedProviders = new ArrayList<>();
connectedProvider = new PrimaryDecompilerProvider(this);
registerServices();
}
private void registerServices() {
registerServiceProvided(DecompilerHighlightService.class, connectedProvider);
}
@Override

View File

@ -55,7 +55,8 @@ import resources.ResourceManager;
import utility.function.Callback;
public class DecompilerProvider extends NavigatableComponentProviderAdapter
implements DomainObjectListener, OptionsChangeListener, DecompilerCallbackHandler {
implements DomainObjectListener, OptionsChangeListener, DecompilerCallbackHandler,
DecompilerHighlightService {
final static String OPTIONS_TITLE = "Decompiler";
private static Icon REFRESH_ICON = Icons.REFRESH_ICON;
@ -1029,18 +1030,29 @@ public class DecompilerProvider extends NavigatableComponentProviderAdapter
}
@Override
public void removeHighlightProvider(HighlightProvider highlightProvider, Program program2) {
public void removeHighlightProvider(HighlightProvider highlightProvider, Program p) {
// currently unsupported
}
@Override
public void setHighlightProvider(HighlightProvider highlightProvider, Program program2) {
public void setHighlightProvider(HighlightProvider highlightProvider, Program p) {
// currently unsupported
}
public void programClosed(Program closedProgram) {
controller.programClosed(closedProgram);
}
@Deprecated // to be removed post 9.2; replace with an API to manipulate primary highlights
@Override
public ClangLayoutController getLayoutModel() {
return (ClangLayoutController) getDecompilerPanel().getLayoutModel();
}
@Deprecated // to be removed post 9.2; replace with an API to manipulate primary highlights
@Override
public void clearHighlights() {
getDecompilerPanel().clearPrimaryHighlights();
}
}

View File

@ -106,6 +106,8 @@ class LayoutTransitionManager {
}
if (layoutAlgorithm instanceof TreeLayout) {
((TreeLayout<AttributedVertex>) layoutAlgorithm).setRootPredicate(rootPredicate);
layoutAlgorithm.setAfter(new PostProcessRunnable<>(
visualizationServer.getVisualizationModel().getLayoutModel()));
}
// remove any previously added layout paintables
removePaintable(radialLayoutRings);
@ -127,7 +129,9 @@ class LayoutTransitionManager {
if (layoutAlgorithm instanceof EdgeSorting) {
((EdgeSorting<AttributedEdge>) layoutAlgorithm).setEdgeComparator(edgeComparator);
}
LayoutAlgorithmTransition.apply(visualizationServer, layoutAlgorithm);
LayoutAlgorithmTransition.apply(visualizationServer,
layoutAlgorithm,
new PostProcessRunnable<>(visualizationServer.getVisualizationModel().getLayoutModel()));
}
private void removePaintable(VisualizationServer.Paintable paintable) {
@ -146,6 +150,9 @@ class LayoutTransitionManager {
.setRootPredicate(rootPredicate);
((TreeLayout<AttributedVertex>) initialLayoutAlgorithm)
.setVertexBoundsFunction(vertexBoundsFunction);
initialLayoutAlgorithm.setAfter(new PostProcessRunnable<>(
visualizationServer.getVisualizationModel().getLayoutModel()));
}
if (initialLayoutAlgorithm instanceof EdgeSorting) {
((EdgeSorting<AttributedEdge>) initialLayoutAlgorithm)

View File

@ -0,0 +1,80 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package ghidra.graph.visualization;
import java.util.*;
import org.jgrapht.Graph;
import org.jungrapht.visualization.layout.model.LayoutModel;
/**
 * Post-processes tree layouts to move vertices that overlap a vertical edge that
 * is not incident on the vertex.
 * This can be removed after jungrapht-layout-1.1
 * @param <V> vertex type
 * @param <E> edge type
 */
public class PostProcessRunnable<V, E> implements Runnable {

	LayoutModel<V> layoutModel;

	public PostProcessRunnable(LayoutModel<V> layoutModel) {
		this.layoutModel = layoutModel;
	}

	@Override
	public void run() {
		moveVerticesThatOverlapVerticalEdges(layoutModel);
	}

	protected int moveVerticesThatOverlapVerticalEdges(LayoutModel<V> layoutModel) {
		int offset = 100;
		int moved = 0;
		Graph<V, E> graph = layoutModel.getGraph();
		Map<Double, Set<E>> verticalEdgeMap = new LinkedHashMap<>();
		graph.edgeSet()
				.stream()
				.filter(e -> layoutModel.apply(graph.getEdgeSource(e)).x == layoutModel
						.apply(graph.getEdgeTarget(e)).x)
				.forEach(e -> verticalEdgeMap
						.computeIfAbsent(layoutModel.apply(graph.getEdgeSource(e)).x,
							k -> new HashSet<>())
						.add(e));
		for (V v : graph.vertexSet()) {
			double x = layoutModel.apply(v).x;
			for (E edge : verticalEdgeMap.getOrDefault(x, Collections.emptySet())) {
				V source = graph.getEdgeSource(edge);
				V target = graph.getEdgeTarget(edge);
				if (!v.equals(source) && !v.equals(target)) {
					double lowy = layoutModel.apply(source).y;
					double hiy = layoutModel.apply(target).y;
					if (lowy > hiy) {
						double temp = lowy;
						lowy = hiy;
						hiy = temp;
					}
					double vy = layoutModel.apply(v).y;
					if (lowy <= vy && vy <= hiy) {
						layoutModel.set(v, layoutModel.apply(v).add(offset, 0));
						moved++;
					}
				}
			}
		}
		return moved;
	}
}

View File

@ -61,7 +61,7 @@ jungrapht.circle.reduceEdgeCrossingMaxEdges=200
jungrapht.initialDimensionVertexDensity=0.3f
jungrapht.minScale=0.001
jungrapht.maxScale=1.0
jungrapht.maxScale=4.0
# not using spatial data structures for vertices at this time. May remove after jungrapht 1.1
jungrapht.vertexSpatialSupport=NONE

View File

@ -24,7 +24,6 @@ import ghidra.app.util.bin.format.pdb.PdbParser.PdbXmlMember;
import ghidra.app.util.importer.MessageLog;
import ghidra.graph.*;
import ghidra.graph.algo.GraphNavigator;
import ghidra.graph.jung.JungDirectedGraph;
import ghidra.program.model.data.Composite;
import ghidra.program.model.data.DataType;
import ghidra.program.model.symbol.SymbolUtilities;
@ -67,8 +66,8 @@ public class ApplyDataTypes {
private List<CompositeDefinition> getCompositeDefinitionsInPostDependencyOrder(
TaskMonitor monitor) {
JungDirectedGraph<CompositeDefinition, GEdge<CompositeDefinition>> graph =
new JungDirectedGraph<>();
GDirectedGraph<CompositeDefinition, GEdge<CompositeDefinition>> graph =
GraphFactory.createDirectedGraph();
for (CompositeDefinition compositeDefinition : compositeQueue.values()) {
graph.addVertex(compositeDefinition);
for (PdbMember m : compositeDefinition.memberList) {

View File

@ -69,26 +69,20 @@ abstract class CompositeDB extends DataTypeDB implements Composite {
protected abstract void initialize();
/**
* Get the preferred length for a new component. For Unions and internally
* aligned structures the preferred component length for a fixed-length dataType
* will be the length of that dataType. Otherwise the length returned will be no
* larger than the specified length.
*
* Get the preferred length for a new component. Constraining length of fixed-length datatype
* may not be sustainable in response to datatype size changes over time.
* @param dataType new component datatype
* @param length constrained length or -1 to force use of dataType size.
* Dynamic types such as string must have a positive length
* specified.
* @param length specified length required for Dynamic types such as string
* which must have a positive length specified.
* @return preferred component length
*/
protected int getPreferredComponentLength(DataType dataType, int length) {
if ((isInternallyAligned() || (this instanceof Union)) && !(dataType instanceof Dynamic)) {
length = -1; // force use of datatype size
if (length > 0 && (dataType instanceof Composite) &&
((Composite) dataType).isNotYetDefined()) {
return length;
}
int dtLength = dataType.getLength();
if (length <= 0) {
length = dtLength;
}
else if (dtLength > 0 && dtLength < length) {
if (dtLength > 0) {
length = dtLength;
}
if (length <= 0) {
@@ -789,4 +783,13 @@ abstract class CompositeDB extends DataTypeDB implements Composite {
}
return " pack(" + packingValue + ")";
}
/**
* Perform any necessary component adjustments based on
* sizes and alignment of components differing from their
* specification, which may be influenced by the data organization.
* If this composite changes, parents will not be
* notified; handling this is the caller's responsibility.
*/
protected abstract void fixupComponents();
}

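The reworked getPreferredComponentLength above prefers a fixed-length datatype's own size and only honors the caller-supplied length for types that cannot dictate one. A stand-alone sketch of that rule, with lengths modeled as plain ints rather than the Ghidra DataType API:

```java
// Sketch of the preferred-length rule: a fixed-length type dictates its own
// component length; a dynamic type (reported length <= 0) must be given one.
public class PreferredLength {

    static int preferredLength(int dataTypeLength, int requestedLength) {
        if (dataTypeLength > 0) {
            return dataTypeLength;   // fixed-length type wins
        }
        if (requestedLength <= 0) {
            throw new IllegalArgumentException(
                "dynamic type requires a positive length");
        }
        return requestedLength;      // dynamic type: caller decides
    }

    public static void main(String[] args) {
        System.out.println(preferredLength(4, 10)); // 4: fixed 4-byte type
        System.out.println(preferredLength(-1, 8)); // 8: dynamic, caller-specified
    }
}
```

Because a component no longer clamps a fixed-length type to the caller's length, later size changes to that type can propagate cleanly, which is what the revised javadoc alludes to.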

@@ -28,6 +28,8 @@ import generic.jar.ResourceFile;
import ghidra.app.plugin.core.datamgr.archive.BuiltInSourceArchive;
import ghidra.framework.store.db.PackedDBHandle;
import ghidra.framework.store.db.PackedDatabase;
import ghidra.graph.*;
import ghidra.graph.algo.GraphNavigator;
import ghidra.program.database.*;
import ghidra.program.database.map.AddressMap;
import ghidra.program.model.address.Address;
@@ -2711,7 +2713,7 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
class StructureIterator implements Iterator<Structure> {
private RecordIterator it;
private Structure nextStruct;
private StructureDB nextStruct;
StructureIterator() throws IOException {
it = compositeAdapter.getRecords();
@@ -2731,9 +2733,9 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
}
@Override
public Structure next() {
public StructureDB next() {
if (hasNext()) {
Structure s = nextStruct;
StructureDB s = nextStruct;
nextStruct = null;
return s;
}
@@ -2746,7 +2748,7 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
Record rec = it.next();
DataType dt = getDataType(rec.getKey(), rec);
if (dt instanceof Structure) {
nextStruct = (Structure) dt;
nextStruct = (StructureDB) dt;
return;
}
}
@@ -2759,7 +2761,7 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
class CompositeIterator implements Iterator<Composite> {
private RecordIterator it;
private Composite nextComposite;
private CompositeDB nextComposite;
CompositeIterator() throws IOException {
it = compositeAdapter.getRecords();
@@ -2779,9 +2781,9 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
}
@Override
public Composite next() {
public CompositeDB next() {
if (hasNext()) {
Composite c = nextComposite;
CompositeDB c = nextComposite;
nextComposite = null;
return c;
}
@@ -2792,7 +2794,7 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
try {
if (it.hasNext()) {
Record rec = it.next();
nextComposite = (Composite) getDataType(rec.getKey(), rec);
nextComposite = (CompositeDB) getDataType(rec.getKey(), rec);
}
}
catch (IOException e) {
@@ -3787,6 +3789,99 @@ abstract public class DataTypeManagerDB implements DataTypeManager {
}
}
/**
* Fixup all composites and their components which may be affected by a data organization
* change, including primitive type size changes and alignment changes. It is highly recommended
* that this program be open with exclusive access before invoking this method to avoid
* excessive merge conflicts with other users.
* @param monitor task monitor
* @throws CancelledException if operation is cancelled
*/
public void fixupComposites(TaskMonitor monitor) throws CancelledException {
lock.acquire();
try {
// NOTE: Any composite could be indirectly affected by a component size change
// based upon type relationships
// NOTE: Composites brought in from an archive may have incorrect component sizes
// if not aligned and should not be used to gauge a primitive size change.
// Unfortunately, the parent table does not track use of primitives, so a brute
// force search is required. Since all composites must be checked, this
// is combined with the composite graph generation to get ordered list
// of composites for subsequent size change operation.
List<CompositeDB> orderedComposites = getAllCompositesInPostDependencyOrder(monitor);
monitor.setProgress(0);
monitor.setMaximum(orderedComposites.size());
monitor.setMessage("Updating Datatype Sizes...");
int count = 0;
for (CompositeDB c : orderedComposites) {
monitor.checkCanceled();
c.fixupComponents();
monitor.setProgress(++count);
}
}
finally {
lock.release();
}
}
/**
* Get composite base type which corresponds to a specified datatype.
* Pointers to composites are ignored. This method is intended to be
* used by the {@link #getAllCompositesInPostDependencyOrder} method only.
* @param dt datatype
* @return base datatype if dt corresponds to a composite or array of composites,
* otherwise null is returned
*/
private CompositeDB getCompositeBaseType(DataType dt) {
while ((dt instanceof Array) || (dt instanceof TypeDef)) {
if (dt instanceof Array) {
dt = ((Array) dt).getDataType();
}
else {
dt = ((TypeDef) dt).getBaseDataType();
}
}
return (dt instanceof CompositeDB) ? (CompositeDB) dt : null;
}
/*
* Graph all composites and return an ordered list with leaves first, detecting
* primitive size changes based upon the specified primitiveTypeIds. It is assumed that TypeDef
* use of primitives has already been handled elsewhere.
* All pointers are ignored and not followed during graph generation.
* This method is intended to facilitate datatype size change propagation in an
* orderly fashion and to reduce redundant size change propagation.
* @param monitor task monitor
* @return ordered list of composites
* @throws CancelledException if task cancelled
*/
private List<CompositeDB> getAllCompositesInPostDependencyOrder(TaskMonitor monitor)
throws CancelledException {
GDirectedGraph<CompositeDB, GEdge<CompositeDB>> graph = GraphFactory.createDirectedGraph();
Iterator<Composite> allComposites = getAllComposites();
while (allComposites.hasNext()) {
monitor.checkCanceled();
CompositeDB c = (CompositeDB) allComposites.next();
graph.addVertex(c);
for (DataTypeComponent m : c.getDefinedComponents()) {
CompositeDB refC = getCompositeBaseType(m.getDataType());
if (refC != null) {
graph.addEdge(new DefaultGEdge<CompositeDB>(c, refC));
}
}
}
return GraphAlgorithms.getVerticesInPostOrder(graph, GraphNavigator.topDownNavigator());
}
/**
* Activate resolveCache and associated resolveQueue if not already active. If
* this method returns true, the caller is responsible for flushing resolveQueue and

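getCompositeBaseType above peels Array and TypeDef wrappers until it reaches the underlying type, ignoring pointers. A self-contained model of that unwrapping loop — the record types here are hypothetical stand-ins for Ghidra's Array, TypeDef, and Composite interfaces, not the real API:

```java
// Stand-in type hierarchy (hypothetical, not the Ghidra API): peel wrapper
// layers until the underlying element/base type is reached, mirroring
// getCompositeBaseType above.
interface Type {}
record ArrayType(Type element) implements Type {}
record TypedefType(Type base) implements Type {}
record CompositeType(String name) implements Type {}
record PrimitiveType(String name) implements Type {}

public class BaseTypeDemo {

    static CompositeType compositeBase(Type t) {
        while (t instanceof ArrayType || t instanceof TypedefType) {
            t = (t instanceof ArrayType a) ? a.element()
                                           : ((TypedefType) t).base();
        }
        // anything that is not ultimately a composite yields null
        return (t instanceof CompositeType c) ? c : null;
    }

    public static void main(String[] args) {
        Type t = new ArrayType(new TypedefType(new CompositeType("Point")));
        System.out.println(compositeBase(t).name());                // Point
        System.out.println(compositeBase(new PrimitiveType("int"))); // null
    }
}
```

Stopping at pointers (not modeled here) is what keeps the dependency graph acyclic for self-referential structures like linked lists.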

@@ -590,8 +590,7 @@ class StructureDB extends CompositeDB implements Structure {
if (equals(dataType)) {
return true;
}
for (int i = 0; i < components.size(); i++) {
DataTypeComponent dtc = components.get(i);
for (DataTypeComponentDB dtc : components) {
DataType subDt = dtc.getDataType();
if (subDt instanceof Composite) {
if (((Composite) subDt).isPartOf(dataType)) {
@@ -1198,8 +1197,7 @@ class StructureDB extends CompositeDB implements Structure {
int oldLength = structLength;
int oldMinAlignment = getMinimumAlignment();
for (int i = 0; i < components.size(); i++) {
DataTypeComponentDB dtc = components.get(i);
for (DataTypeComponentDB dtc : components) {
dtc.getDataType().removeParent(this);
componentAdapter.removeRecord(dtc.getKey());
}
@@ -1364,37 +1362,44 @@ class StructureDB extends CompositeDB implements Structure {
try {
checkDeleted();
if (isInternallyAligned()) {
adjustInternalAlignment(true);
adjustComponents(true); // notifies parents
return;
}
boolean didChange = false;
boolean warn = false;
int n = components.size();
for (int i = 0; i < n; i++) {
DataTypeComponentDB dtc = components.get(i);
int nextIndex = i + 1;
if (dtc.getDataType() == dt) {
// assume no impact to bitfields since base types
// should not change size
int dtLen = dt.getLength();
int dtcLen = dtc.getLength();
if (dtLen < dtcLen) {
dtc.setLength(dtLen, true);
shiftOffsets(nextIndex, dtcLen - dtLen, 0);
int length = getPreferredComponentLength(dt, dtcLen);
if (length < dtcLen) {
dtc.setLength(length, true);
shiftOffsets(i + 1, dtcLen - length, 0);
didChange = true;
}
else if (dtLen > dtcLen) {
int consumed = consumeBytesAfter(i, dtLen - dtcLen);
else if (length > dtcLen) {
int consumed = consumeBytesAfter(i, length - dtcLen);
if (consumed > 0) {
dtc.updateRecord();
shiftOffsets(nextIndex, -consumed, 0);
shiftOffsets(i + 1, -consumed, 0);
didChange = true;
}
}
if (dtc.getLength() != length) {
warn = true;
}
}
}
if (didChange) {
adjustInternalAlignment(true);
notifySizeChanged();
adjustInternalAlignment(false);
notifySizeChanged(); // notifies parents
}
if (warn) {
Msg.warn(this,
"Failed to resize one or more structure components: " + getPathName());
}
}
finally {
@@ -1402,6 +1407,56 @@ class StructureDB extends CompositeDB implements Structure {
}
}
@Override
protected void fixupComponents() {
if (isInternallyAligned()) {
// Do not notify parents
if (adjustComponents(false)) {
dataMgr.dataTypeChanged(this);
}
return;
}
boolean didChange = false;
boolean warn = false;
int n = components.size();
for (int i = 0; i < n; i++) {
DataTypeComponentDB dtc = components.get(i);
DataType dt = dtc.getDataType();
if (dt instanceof BitFieldDataType) {
// TODO: could get messy
continue;
}
int dtcLen = dtc.getLength();
int length = getPreferredComponentLength(dt, dtcLen);
if (dtcLen != length) {
if (length < dtcLen) {
dtc.setLength(length, true);
shiftOffsets(i + 1, dtcLen - length, 0);
didChange = true;
}
else if (length > dtcLen) {
int consumed = consumeBytesAfter(i, length - dtcLen);
if (consumed > 0) {
dtc.updateRecord();
shiftOffsets(i + 1, -consumed, 0);
didChange = true;
}
}
if (dtc.getLength() != length) {
warn = true;
}
}
}
if (didChange) {
// Do not notify parents
adjustInternalAlignment(false);
dataMgr.dataTypeChanged(this);
}
if (warn) {
Msg.warn(this, "Failed to resize one or more structure components: " + getPathName());
}
}
@Override
public void dataTypeAlignmentChanged(DataType dt) {
lock.acquire();
@@ -1808,8 +1863,7 @@ class StructureDB extends CompositeDB implements Structure {
flexibleArrayComponent = null;
}
for (int i = 0; i < components.size(); i++) {
DataTypeComponentDB dtc = components.get(i);
for (DataTypeComponentDB dtc : components) {
dtc.getDataType().removeParent(this);
try {
componentAdapter.removeRecord(dtc.getKey());
@@ -1885,12 +1939,14 @@ class StructureDB extends CompositeDB implements Structure {
changed |= updateComposite(packResult.numComponents, packResult.structureLength,
packResult.alignment, false);
if (notify & changed) {
if (oldLength != structLength) {
notifySizeChanged();
}
else {
dataMgr.dataTypeChanged(this);
if (changed) {
if (notify) {
if (oldLength != structLength) {
notifySizeChanged();
}
else {
dataMgr.dataTypeChanged(this);
}
}
return true;
}

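The StructureDB fix-up logic above resizes a component to its preferred length and then adjusts what follows. A simplified model of the bookkeeping: here later components simply slide by the size delta, whereas StructureDB itself calls shiftOffsets and consumeBytesAfter to trade space with undefined bytes while preserving component ordinals:

```java
import java.util.*;

// Simplified offset fix-up in an unaligned structure: when component `index`
// is resized, every later component slides by the size delta. (The real
// StructureDB back-fills or consumes undefined bytes instead of sliding.)
public class ShiftDemo {

    record Comp(String name, int offset, int length) {}

    static List<Comp> resize(List<Comp> comps, int index, int newLength) {
        int delta = newLength - comps.get(index).length();
        List<Comp> out = new ArrayList<>();
        for (int i = 0; i < comps.size(); i++) {
            Comp c = comps.get(i);
            if (i == index) {
                out.add(new Comp(c.name(), c.offset(), newLength));
            }
            else if (i > index) {
                out.add(new Comp(c.name(), c.offset() + delta, c.length()));
            }
            else {
                out.add(c); // components before the change are untouched
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Comp> comps = List.of(new Comp("a", 0, 4), new Comp("b", 4, 2));
        System.out.println(resize(comps, 0, 2)); // "b" moves to offset 2
    }
}
```

The warning path in the real code covers the case where a component cannot reach its preferred length because no undefined bytes are available to consume.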

@@ -277,8 +277,7 @@ class UnionDB extends CompositeDB implements Union {
int oldLength = unionLength;
int oldMinAlignment = getMinimumAlignment();
for (int i = 0; i < components.size(); i++) {
DataTypeComponentDB dtc = components.get(i);
for (DataTypeComponentDB dtc : components) {
dtc.getDataType().removeParent(this);
removeComponent(dtc.getKey());
}
@@ -432,7 +431,7 @@ class UnionDB extends CompositeDB implements Union {
}
}
if (changed) {
adjustLength(true, false);
adjustLength(true, false); // notifies parents
}
}
finally {
@@ -440,6 +439,30 @@ class UnionDB extends CompositeDB implements Union {
}
}
@Override
protected void fixupComponents() {
boolean changed = false;
for (DataTypeComponentDB dtc : components) {
DataType dt = dtc.getDataType();
if (dt instanceof BitFieldDataType) {
dt = adjustBitField(dt); // in case base type changed
}
int dtcLen = dtc.getLength();
int length = getPreferredComponentLength(dt, dtcLen);
if (length != dtcLen) {
dtc.setLength(length, true);
changed = true;
}
}
if (changed || isInternallyAligned()) {
// NOTE: since we do not retain our external alignment, we have no way of knowing if
// it has changed, so we must assume it has if we are an aligned union.
// Do not notify parents
adjustLength(false, false);
dataMgr.dataTypeChanged(this);
}
}
@Override
public void dataTypeAlignmentChanged(DataType dt) {
adjustInternalAlignment(true);
@@ -614,7 +637,9 @@ class UnionDB extends CompositeDB implements Union {
catch (IOException e) {
dataMgr.dbError(e);
}
notifySizeChanged();
if (notify) {
notifySizeChanged();
}
}
else if (notify) {
dataMgr.dataTypeChanged(this);

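UnionDB.fixupComponents above recomputes each member's preferred length and then calls adjustLength. The core invariant — a union is as long as its largest member — can be shown in isolation (alignment padding is ignored in this sketch):

```java
import java.util.*;

// A union is as long as its largest member; resizing any member just
// recomputes that maximum (a simplified model of UnionDB.adjustLength,
// ignoring alignment padding).
public class UnionLengthDemo {

    static int unionLength(List<Integer> memberLengths) {
        int max = 0;
        for (int len : memberLengths) {
            max = Math.max(max, len);
        }
        return max;
    }

    public static void main(String[] args) {
        System.out.println(unionLength(List.of(4, 8, 2))); // 8
    }
}
```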

@@ -82,7 +82,7 @@ public interface Program extends DataTypeManagerDomainObject {
* Returns the program's datatype manager.
*/
@Override
public DataTypeManager getDataTypeManager();
public ProgramBasedDataTypeManager getDataTypeManager();
/**
* Returns the program's function manager.

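The Program interface change above narrows the declared return type of getDataTypeManager. Java allows an override to declare a covariant (more specific) return type, so code compiled against the old signature keeps working while new callers gain the richer type without a cast. A minimal illustration with stand-in interfaces (hypothetical, not the real Ghidra types):

```java
// Covariant return: the override narrows the declared return type, so callers
// going through Program see the richer type with no cast.
interface DataTypeManager {}

interface ProgramBasedDataTypeManager extends DataTypeManager {
    default String programName() { return "demo"; }
}

interface DomainObject {
    DataTypeManager getDataTypeManager();
}

interface Program extends DomainObject {
    @Override
    ProgramBasedDataTypeManager getDataTypeManager(); // narrowed, still a legal override
}

public class CovariantDemo {
    public static void main(String[] args) {
        Program p = () -> new ProgramBasedDataTypeManager() {};
        // No cast needed: the narrowed return type is visible through Program.
        System.out.println(p.getDataTypeManager().programName());
    }
}
```

This is why the diff touches only the interface: implementations already returning the narrower type satisfy both declarations.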

@@ -85,6 +85,30 @@ public class StructureDBTest extends AbstractGTest {
return (Pointer) dataMgr.resolve(new Pointer32DataType(dataType), null);
}
@Test
public void testEmpty() throws Exception {
Structure s = new StructureDataType("foo", 0);
assertTrue(s.isNotYetDefined());
assertEquals(0, s.getNumComponents());
assertEquals(0, s.getNumDefinedComponents());
Structure s2 = (Structure) dataMgr.resolve(s, null);
assertTrue(s2.isNotYetDefined());
assertEquals(0, s2.getNumComponents());
assertEquals(0, s2.getNumDefinedComponents());
}
@Test
public void testSizeOne() throws Exception {
Structure s = new StructureDataType("foo", 1);
assertFalse(s.isNotYetDefined());
assertEquals(1, s.getNumComponents());
assertEquals(0, s.getNumDefinedComponents());
Structure s2 = (Structure) dataMgr.resolve(s, null);
assertFalse(s2.isNotYetDefined());
assertEquals(1, s2.getNumComponents());
assertEquals(0, s2.getNumDefinedComponents());
}
@Test
public void testAdd() throws Exception {
assertEquals(8, struct.getLength());


@@ -9,5 +9,4 @@ LICENSE||GHIDRA||||END|
NOTICE||GHIDRA||||END|
README.md||GHIDRA||||END|
build.gradle||GHIDRA||||END|
ghidra.repos.config||GHIDRA||||END|
settings.gradle||GHIDRA||||END|